Article

Some Information Measures Properties of the GOS-Concomitants from the FGM Family

Florentina Suter, Ioana Cernat and Mihai Drăgan
1 Faculty of Mathematics and Computer Science, University of Bucharest, Academiei 14, 010014 Bucharest, Romania
2 “Gheorghe Mihoc-Caius Iacob” Institute of Mathematical Statistics and Applied Mathematics of the Romanian Academy, Calea 13 Septembrie 13, 050711 Bucharest, Romania
* Author to whom correspondence should be addressed.
Entropy 2022, 24(10), 1361; https://doi.org/10.3390/e24101361
Submission received: 19 July 2022 / Revised: 20 September 2022 / Accepted: 21 September 2022 / Published: 26 September 2022
(This article belongs to the Special Issue Information and Divergence Measures)

Abstract: In this paper, we recall, extend, and compute some information measures for the concomitants of the generalized order statistics (GOS) from the Farlie–Gumbel–Morgenstern (FGM) family. We focus on two types of information measures: some related to Shannon entropy, and some related to Tsallis entropy. Among the information measures considered are the residual and past entropies, which are important in a reliability context.

1. Introduction

The notion of concomitants, or induced order statistics, arose in the early 1970s in the works of David [1] and Bhattacharya [2]. Briefly, when a sample from a bivariate distribution is ordered by the first variate, the second variate paired with the r-th ordered first variate is called the concomitant of the r-th order statistic. Concomitants are important in situations that involve two characteristics, where measuring one of them can influence the other. Therefore, they have applications in many fields, such as selection procedures, inference problems, double sampling plans, and systems reliability. For example, in [3,4], complex systems whose components have two sub-components performing different tasks are studied from a reliability point of view, and in [5], the distribution theory of the lifetimes of two-component systems is discussed. In studies regarding concomitants, two elements have to be specified: the kind of dependence between the first and the second variate, and the kind of ordering of the first variate. The majority of studies are based on the hypothesis of simple order statistics, but there are also studies that assume other kinds of ordering, such as record values or generalized order statistics.
Generalized order statistics (GOS) were introduced by Kamps [6] as a unifying concept for various types of ordered random variables, such as simple order statistics, record values, and sequential order statistics.
In this paper, we focus on the concomitants of GOS when the dependence structure between the first variate and the second variate is given by the Farlie–Gumbel–Morgenstern (FGM) family. This is a flexible family of bivariate distributions used as a modeling tool for bivariate data in many fields [7], one such field being reliability; see [3,4,5]. The FGM family has a simple analytical form, but it can describe only relatively weak dependence, because the correlation coefficient between the two components cannot exceed 1/3. To overcome this limitation, extensions of the FGM family have been proposed, for example, the iterated FGM distributions or the Huang–Kotz FGM distributions [8,9,10,11,12]. The results obtained in our paper will be generalized to these extensions of the FGM family in a future work.
For the concomitants mentioned above, we recall and derive properties of several information measures. The measures we deal with fall into two categories: those related to Shannon entropy and those related to Tsallis entropy.
Since it was introduced in physics and adapted to information theory by Shannon in 1948, the concept of entropy has become more and more important in fields such as information theory, coding theory, probability and statistics, and reliability.
In probability and statistics, entropy measures the uncertainty associated with a random variable. Taking Shannon entropy as a starting point, a series of entropies have been defined as generalizations of it. For the concomitants of GOS from the FGM family, we will look at Shannon-derived and Tsallis-derived entropies, and our main aim is to determine Awad-type extensions for all the considered entropies, because Awad entropies avoid several drawbacks of Shannon entropy: different systems having the same entropy, possible negative values for continuous distributions, different behavior under linear transformations of a random variable in the discrete and continuous cases, etc.
Furthermore, for the concomitants of GOS from the FGM family, we will determine not only entropies, but also other information measures, such as the Tsallis divergence and the shift-invariant Fisher–Tsallis information number.
In the following sections, we recall some definitions and properties of GOS and their concomitants, in particular, when the bivariate distribution is in the FGM family. Then, we will discuss Shannon-type entropies, Tsallis-type entropies, Fisher information and divergences for concomitants of GOS from the FGM family. For these concomitants, in the last section, we will introduce new extensions and results on information measures.

2. Generalized Order Statistics and Their Concomitants for the FGM Family

2.1. Generalized Order Statistics

The concept of GOS was introduced by Kamps [6] in 1995, who proposed a unifying pattern for various order statistics:
Definition 1. 
The random variables $X(1,n,\tilde m,k),\ldots,X(n,n,\tilde m,k)$ are called GOS based on the distribution function $F$ with density function $f$, if their joint density function is given by:
$$ f(x_1,\ldots,x_n) = k\left(\prod_{j=1}^{n-1}\gamma_j\right)\left[\prod_{i=1}^{n-1}(1-F(x_i))^{m_i}f(x_i)\right](1-F(x_n))^{k-1}f(x_n) $$
on the cone $F^{-1}(0)<x_1\le x_2\le\cdots\le x_n<F^{-1}(1)$ of $\mathbb{R}^n$, with parameters $n\in\mathbb{N}$, $n\ge2$, $k>0$, $\tilde m=(m_1,\ldots,m_{n-1})\in\mathbb{R}^{n-1}$, and $\gamma_r = k+n-r+\sum_{j=r}^{n-1}m_j>0$ for all $r\in\{1,2,\ldots,n-1\}$.
Some particular cases of GOS are:
  • Simple order statistics, with $m_1=m_2=\cdots=m_{n-1}=0$ and $k=1$;
  • Common record values, with $m_1=m_2=\cdots=m_{n-1}=-1$ and $k=1$;
  • Sequential order statistics, with $\gamma_i=(n-i+1)\alpha_i$, $\alpha_1,\alpha_2,\ldots,\alpha_n>0$;
  • Progressive type II censored order statistics based on the censoring scheme $(R_1,R_2,\ldots,R_n)$, with $\gamma_n=k=R_n+1$, $\gamma_r=n-r+1+\sum_{i=r}^{n}R_i$ for $1\le r\le n$, and $m_r=R_r$, $1\le r\le n-1$ [13,14].
As a result of the complex form of the joint density, finding the marginal distributions of (1) is a difficult task, but in some particular cases the marginal densities can be found. In [6], the marginal densities are determined for $m_1=m_2=\cdots=m_{n-1}=m$, and in [15] for $\gamma_i\neq\gamma_j$, $1\le i<j\le n$. In the following, we will suppose that $m_1=m_2=\cdots=m_{n-1}=m$, i.e., we are in the m-GOS case, which includes simple order statistics, record values, and progressive type II censored order statistics with an equi-balanced censoring scheme. The density (1) now becomes:
$$ f(x_1,\ldots,x_n) = k\left(\prod_{j=1}^{n-1}\gamma_j\right)\left[\prod_{i=1}^{n-1}(1-F(x_i))^{m}f(x_i)\right](1-F(x_n))^{k-1}f(x_n) $$
on the cone $F^{-1}(0)<x_1\le x_2\le\cdots\le x_n<F^{-1}(1)$ of $\mathbb{R}^n$, with parameters $n\in\mathbb{N}$, $n\ge2$, $k>0$, $m\in\mathbb{R}$, and $\gamma_r = k+(n-r)(m+1)>0$ for all $r\in\{1,2,\ldots,n-1\}$.
The marginal density function of the r-th GOS, $r=1,2,\ldots,n$, in this case is given by [6]:
$$ f_r(x) = \frac{c_{r-1}}{(r-1)!}\,(1-F(x))^{\gamma_r-1}f(x)\,g_m^{r-1}(F(x)), $$
where $c_{r-1}=\prod_{j=1}^{r}\gamma_j$ and, for $x\in[0,1)$:
$$ g_m(x) = \begin{cases} \dfrac{1}{m+1}\left(1-(1-x)^{m+1}\right), & m\neq-1,\\[4pt] \log\dfrac{1}{1-x}, & m=-1. \end{cases} $$
Remark 1. 
For $m=0$ and $k=1$, i.e., the case of simple order statistics, we have $\gamma_r=n-r+1$, $c_{r-1}=\gamma_1\gamma_2\cdots\gamma_r=n(n-1)\cdots(n-r+1)$, and $g_0(F(x))=F(x)$, so (3) becomes the well-known marginal density of the r-th order statistic.
For $m=-1$ and $k=1$, i.e., the case of record values, (3) becomes
$$ f_r(x)=\frac{1}{(r-1)!}\,f(x)\left[-\ln(1-F(x))\right]^{r-1}, $$
the marginal density of the r-th record value.
For progressive type II censored order statistics with an equi-balanced censoring scheme, the marginal density has the same form as (3), with $m=R$, $R$ being the removal number.
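As a quick numerical illustration of the marginal density (3), the following Python sketch (our own check, not part of the original derivations) evaluates $f_r$ for a standard exponential parent distribution, $F(x)=1-e^{-x}$, and verifies that it integrates to one; the values of $(r,n,m,k)$ are arbitrary illustrative choices.

import math
import numpy as np
from scipy.integrate import quad

def g_m(x, m):
    # g_m from (4)
    if m == -1:
        return np.log(1.0 / (1.0 - x))
    return (1.0 - (1.0 - x) ** (m + 1)) / (m + 1)

def gos_marginal_pdf(x, r, n, m, k):
    # f_r from (3) for an exponential parent: F(x) = 1 - exp(-x), f(x) = exp(-x)
    gammas = [k + (n - j) * (m + 1) for j in range(1, r + 1)]
    c = math.prod(gammas)                       # c_{r-1} = gamma_1 ... gamma_r
    F, f = 1.0 - np.exp(-x), np.exp(-x)
    return c / math.factorial(r - 1) * (1.0 - F) ** (gammas[-1] - 1) * f * g_m(F, m) ** (r - 1)

for (r, n, m, k) in [(2, 5, 0, 1), (3, 5, -1, 1), (2, 4, 2, 1)]:
    total, _ = quad(gos_marginal_pdf, 0, np.inf, args=(r, n, m, k))
    print(r, n, m, k, "->", round(total, 6))    # each result should be ~1.0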

2.2. Concomitants

The term concomitant was introduced by David (1973) [1] and has the following definition:
Definition 2. 
Let $(X_1,Y_1),(X_2,Y_2),\ldots,(X_n,Y_n)$ be iid bivariate random variables with cumulative distribution function $F(x,y)$. Then, the Y-variate associated with the r-th order statistic of the X's, $X_{(r:n)}$, denoted by $Y_{[r:n]}$, is the concomitant of $X_{(r:n)}$.
A natural use of concomitants is in selection procedures, when k individuals are chosen on the basis of their X-values; the corresponding Y-values then represent performance on an associated characteristic. In reliability theory, the role of concomitants is emphasized in [3,4,5].

2.3. Concomitants of FGM Family

The FGM bivariate distribution family has a flexible form and was studied by Farlie [16], Gumbel [17], Morgenstern [18], and Johnson and Kotz [19].
Definition 3. 
Let X and Y be two random variables with distribution functions $F_X$ and $F_Y$, respectively. Additionally, let $\alpha$ be a real number. Then, the FGM family has the distribution function:
$$ F_{X,Y}(x,y) = F_X(x)F_Y(y)\left[1+\alpha(1-F_X(x))(1-F_Y(y))\right]. $$
The corresponding probability density function (pdf) of (5) is:
$$ f_{X,Y}(x,y) = f_X(x)f_Y(y)\left[1+\alpha(1-2F_X(x))(1-2F_Y(y))\right], $$
where $f_X(x)$ and $f_Y(y)$ are the marginal densities of $f_{X,Y}(x,y)$.
The parameter $\alpha\in[-1,1]$ is known as the association parameter, and the two random variables X and Y are independent when $\alpha=0$. For $\alpha\neq0$, there is a dependence between the two variables, characterized by the FGM copula, whose properties were studied in [20].
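To make the weak-dependence limitation concrete, the following short Python simulation (an illustrative sketch; all numerical values are our own choices) samples from the FGM copula by inverting the conditional distribution function of $V$ given $U=u$, which is $v[1+\alpha(1-2u)(1-v)]$; for uniform marginals the correlation equals $\alpha/3$, so it cannot exceed $1/3$ in absolute value.

import numpy as np

rng = np.random.default_rng(0)

def fgm_sample(alpha, size):
    u, w = rng.uniform(size=size), rng.uniform(size=size)
    a = alpha * (1.0 - 2.0 * u)
    a = np.where(np.abs(a) < 1e-9, 1e-9, a)    # guard the division below
    # invert the conditional cdf v * (1 + a * (1 - v)), a quadratic in v
    v = ((1.0 + a) - np.sqrt((1.0 + a) ** 2 - 4.0 * a * w)) / (2.0 * a)
    return u, v

for alpha in (-1.0, 0.5, 1.0):
    u, v = fgm_sample(alpha, 200_000)
    print(alpha, np.corrcoef(u, v)[0, 1])      # approximately alpha / 3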
Concomitants of the FGM family, related to GOS, started to come into notice with the work of Beg and Ahsanullah in 2008 [21], where the density $g_{[r,n,m,k]}$ of the concomitant of the r-th GOS is derived:
$$ g_{[r,n,m,k]}(y) = f_Y(y) + \alpha\,(2F_Y(y)-1)\,f_Y(y)\,C^*(r,n,m,k), $$
where
$$ C^*(r,n,m,k) = 1-\frac{2\,c_{r-1}}{(\gamma_1+1)(\gamma_2+1)\cdots(\gamma_r+1)} $$
is a constant.
Remark 2. 
If $m=0$, $k=1$, then $C^*(r,n,0,1) = -\dfrac{n-2r+1}{n+1}$ and
$$ g_{[r,n,0,1]}(y) = f_Y(y) - \alpha\,\frac{n-2r+1}{n+1}\,(2F_Y(y)-1)\,f_Y(y) $$
is the density of the concomitant of the r-th order statistic from the FGM family.
If $m=-1$, $k=1$, then $C^*(r,n,-1,1) = 1-2^{1-r}$ and
$$ g_{[r,n,-1,1]}(y) = f_Y(y) - \alpha\,(2^{1-r}-1)(2F_Y(y)-1)\,f_Y(y). $$
If we are in the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the density of the concomitant of the r-th order statistic from the FGM family is (7) with $m=R$, the removal number.
The cumulative distribution function and the survival function of the concomitant of the r-th GOS can also be computed:
$$ G_{[r,n,m,k]}(y) = F_Y(y) - \alpha\,(1-F_Y(y))\,F_Y(y)\,C^*(r,n,m,k), $$
$$ \bar G_{[r,n,m,k]}(y) = 1-F_Y(y) + \alpha\,(1-F_Y(y))\,F_Y(y)\,C^*(r,n,m,k). $$
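The constant $C^*(r,n,m,k)$ is easy to compute in the m-GOS case, where $\gamma_j=k+(n-j)(m+1)$. A minimal Python sketch (our own illustrative check) reproduces the two special values of Remark 2:

import math

def c_star(r, n, m, k):
    gammas = [k + (n - j) * (m + 1) for j in range(1, r + 1)]
    c = math.prod(gammas)                        # c_{r-1}
    denom = math.prod(g + 1 for g in gammas)     # (gamma_1 + 1)...(gamma_r + 1)
    return 1.0 - 2.0 * c / denom

r, n = 3, 7
print(c_star(r, n, 0, 1), -(n - 2 * r + 1) / (n + 1))   # order statistics
print(c_star(r, n, -1, 1), 1 - 2 ** (1 - r))            # record values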
In the following, in order to make the computations easier to read, we use the notations $Y_{[r]}^* = Y_{[r,n,m,k]}$, $g_{[r]} = g_{[r,n,m,k]}$, $G_{[r]} = G_{[r,n,m,k]}$, $\bar G_{[r]} = \bar G_{[r,n,m,k]}$, and $C_r^* = C^*(r,n,m,k)$.

3. Information Measures for the Concomitants from the FGM Family, Existing Results

In this section, we will recall some definitions and results for the information measures of the concomitants of GOS from the FGM family.

3.1. Shannon and Shannon-Related Entropies

Shannon entropy was introduced by Shannon in 1948 [22]; it has multiple applications and can be defined as:
$$ H_S(X) = -E[\log f(X)]. $$
Information measures for concomitants derived from the FGM family have been studied by Tahmasebi and Behboodian: in [23] for concomitants of order statistics, and in [24] for concomitants of GOS. Using (7), they proved that the Shannon entropy of $Y_{[r]}^*$, the concomitant of the r-th generalized order statistic, is:
$$ H_S(Y_{[r]}^*) = W(r,\alpha,n,m,k) + H_S(Y)\,(1-\alpha C_r^*) - 2\alpha C_r^*\,\phi_f(u), $$
where
$$ W(r,\alpha,n,m,k) = \frac{1}{4\alpha C_r^*}\left\{(1-\alpha C_r^*)^2\log(1-\alpha C_r^*) - (1+\alpha C_r^*)^2\log(1+\alpha C_r^*)\right\} + \frac{1}{2}, $$
and
$$ \phi_f(u) = \int_0^1 u\,\log f_Y(F_Y^{-1}(u))\,du. $$
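As a simple sanity check of (13), assume for illustration that Y is uniform on (0,1); then $H_S(Y)=0$ and $\phi_f(u)=0$, so the entropy of the concomitant reduces to $W(r,\alpha,n,m,k)$, a function of $a=\alpha C_r^*$ alone, and this can be confirmed by direct numerical integration of $-g_{[r]}\log g_{[r]}$. A minimal Python sketch:

import numpy as np
from scipy.integrate import quad

def W(a):
    # W from (14) as a function of a = alpha * C_r^*
    return ((1 - a) ** 2 * np.log(1 - a) - (1 + a) ** 2 * np.log(1 + a)) / (4 * a) + 0.5

def concomitant_entropy_uniform(a):
    g = lambda y: 1.0 + a * (2.0 * y - 1.0)      # g_[r] for a uniform Y
    val, _ = quad(lambda y: -g(y) * np.log(g(y)), 0.0, 1.0)
    return val

for a in (-0.6, 0.25, 0.9):                      # arbitrary values of alpha * C_r^*
    print(a, W(a), concomitant_entropy_uniform(a))   # the two columns should agree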
Remark 3. 
In [23], the properties of (13) are analyzed in the particular case $m=0$ and $k=1$, i.e., when the GOS reduce to simple order statistics; in this case, the entropy (13) is the Shannon entropy of the concomitant of the r-th order statistic:
$$ H_S(Y_{[r]}) = W(r,\alpha,n,0,1) + H_S(Y)\left(1+\alpha\frac{n-2r+1}{n+1}\right) + 2\alpha\frac{n-2r+1}{n+1}\,\phi_f(u). $$
In [24], the Shannon entropy for the concomitants of record values is also given:
$$ H_S(R_{[r]}) = W(r,\alpha,n,-1,1) + H_S(Y)\left(1+\alpha(2^{1-r}-1)\right) + 2\alpha(2^{1-r}-1)\,\phi_f(u). $$
If we are in the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the Shannon entropy of the concomitant of the r-th order statistic from the FGM family is (13) with $m=R$, the removal number.
Awad [25] noticed in 1987 that, in the continuous case, Shannon entropy does not fulfill the condition that entropy is preserved under linear transformations, and proposed the following entropy, also known in the literature as sup-entropy:
$$ H_S^A(X) = -E\left[\log\frac{f(X)}{\delta}\right], $$
where $\delta = \sup\{f(x)\,|\,x\in\mathbb{R}\}$. We will call this entropy the Shannon–Awad entropy.
Residual and past Shannon entropies were defined in the context of reliability, being important in measuring the amount of information contained in the residual life or the past life of a unit. In the following, the random variable X, with pdf f, cdf F, and survival function $\bar F$, is considered positive and represents the lifetime of a unit.
Residual entropy was introduced, and its properties analyzed, in the works of Ebrahimi and Pellerey [26] and Ebrahimi [27]. Residual entropy is based on the idea of measuring the expected uncertainty contained in the conditional density of $X-t$ given $X>t$ [27]:
$$ H_S(X;t) = -E\left[\log\frac{f(X)}{\bar F(t)}\,\Big|\,X>t\right]. $$
In terms of the failure rate, the residual entropy can be written as:
$$ H_S(X;t) = 1-E\left[\log\lambda_F(X)\,|\,X>t\right], $$
where $\lambda_F(\cdot) = f(\cdot)/\bar F(\cdot)$ is the failure rate function.
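For example, for an exponential lifetime with rate $\lambda$, the failure rate is constant, $\lambda_F\equiv\lambda$, so the residual entropy is $H_S(X;t)=1-\log\lambda$ for every $t$, reflecting the memoryless property of the exponential distribution.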
Similarly to the definition of the residual entropy, Di Crescenzo and Longobardi [28] introduced past entropy as a dual to the residual entropy. Past entropy measures the uncertainty about the past life of a failed unit:
$$ \bar H_S(X;t) = -E\left[\log\frac{f(X)}{F(t)}\,\Big|\,X<t\right]. $$
In terms of the reversed failure rate, the past entropy can be written as:
$$ \bar H_S(X;t) = 1-E\left[\log\tau_F(X)\,|\,X<t\right], $$
where $\tau_F(\cdot) = f(\cdot)/F(\cdot)$ is the reversed failure rate function.
Residual and past entropies for the concomitants of GOS from the FGM family were determined by Mohie EL-Din et al. in [29]. They also considered concomitants of other types of GOS, but the form of the entropies is similar. The residual entropy of the concomitant of the r-th GOS from the FGM family is [29]:
$$ H_S(Y_{[r]}^*;t) = \log\bar G_{[r]}(t) - \frac{1}{\bar G_{[r]}(t)}\Big\{(1-\alpha C_r^*)\Big[\bar F_Y(t)\Big(\log\bar F_Y(t) - H_S(Y;t)\Big)\Big] + 2\alpha C_r^*\,\phi_f(t) + K_1(r,t,\alpha,n,m,k)\Big\}, $$
where
$$ K_1(r,t,\alpha,n,m,k) = \frac{1}{2\alpha C_r^*}\Big\{-\frac{1}{4}\Big[(1+\alpha C_r^*)^2-\big(1+\alpha C_r^*(2F_Y(t)-1)\big)^2\Big] + \frac{1}{2}\Big[(1+\alpha C_r^*)^2\log(1+\alpha C_r^*)-\big(1+\alpha C_r^*(2F_Y(t)-1)\big)^2\log\big(1+\alpha C_r^*(2F_Y(t)-1)\big)\Big]\Big\}, $$
and
$$ \phi_f(t) = \int_t^{\infty} F_Y(y)\,f_Y(y)\log f_Y(y)\,dy. $$
We notice that for t = 0 , the residual entropy (21) becomes the entropy (13).
Remark 4. 
For $m=0$ and $k=1$, we obtain the residual Shannon entropy for the concomitant of the r-th order statistic:
$$ H_S(Y_{[r]};t) = \log\bar G_{[r]}(t) - \frac{1}{\bar G_{[r]}(t)}\Big\{\Big(1+\alpha\frac{n-2r+1}{n+1}\Big)\Big[\bar F_Y(t)\Big(\log\bar F_Y(t)-H_S(Y;t)\Big)\Big] - 2\alpha\frac{n-2r+1}{n+1}\,\phi_f(t) + K_1(r,t,\alpha,n,0,1)\Big\}. $$
For $m=-1$ and $k=1$, we obtain the residual Shannon entropy for the concomitant of the r-th record value:
$$ H_S(R_{[r]};t) = \log\bar G_{[r]}(t) - \frac{1}{\bar G_{[r]}(t)}\Big\{\big(1+\alpha(2^{1-r}-1)\big)\Big[\bar F_Y(t)\Big(\log\bar F_Y(t)-H_S(Y;t)\Big)\Big] - 2\alpha(2^{1-r}-1)\,\phi_f(t) + K_1(r,t,\alpha,n,-1,1)\Big\}. $$
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the residual Shannon entropy of the concomitant of the r-th order statistic from the FGM family is (21) with $m=R$, the removal number.
In a similar way, the past entropy for the concomitant of the r-th GOS from the FGM family is [29]:
$$ \bar H_S(Y_{[r]}^*;t) = \log G_{[r]}(t) - \frac{1}{G_{[r]}(t)}\Big\{(1-\alpha C_r^*)\Big[F_Y(t)\Big(\log F_Y(t)-\bar H_S(Y;t)\Big)\Big] + 2\alpha C_r^*\,\bar\phi_f(t) + K_2(r,t,\alpha,n,m,k)\Big\}, $$
where
$$ K_2(r,t,\alpha,n,m,k) = \frac{1}{2\alpha C_r^*}\Big\{-\frac{1}{4}\Big[\big(1+\alpha C_r^*(2F_Y(t)-1)\big)^2-(1-\alpha C_r^*)^2\Big] + \frac{1}{2}\Big[\big(1+\alpha C_r^*(2F_Y(t)-1)\big)^2\log\big(1+\alpha C_r^*(2F_Y(t)-1)\big)-(1-\alpha C_r^*)^2\log(1-\alpha C_r^*)\Big]\Big\}, $$
and
$$ \bar\phi_f(t) = \int_0^t F_Y(y)\,f_Y(y)\log f_Y(y)\,dy. $$
We notice that for $t\to\infty$, the past entropy (26) becomes the entropy (13).

3.2. Tsallis and Tsallis-Related Entropies

Tsallis entropy was first introduced and used, in the context of cybernetics, by Havrda and Charvát [30], but it became well known after its definition as a generalization of Boltzmann–Gibbs statistics, in the context of thermodynamics, by Tsallis in 1988 [31]. Being the starting point of the field of non-extensive statistics, Tsallis entropy is a non-additive generalization of Shannon entropy; for a continuous random variable X with density function f, it can be defined as:
$$ H_T(X) = \frac{1}{q-1}\left(1-\int_{-\infty}^{+\infty}[f(x)]^q\,dx\right),\qquad q>0,\ q\neq1. $$
When $q\to1$, Tsallis entropy approaches Shannon entropy. Tsallis entropy has, in turn, various generalizations; see, for example, [32].
Another important element of non-extensive statistics is the $\log_q$ function:
$$ \log_q x = \frac{x^{1-q}-1}{1-q},\qquad x>0,\ q\neq1, $$
and Tsallis entropy can be obtained using this function in two ways:
$$ H_T(X) = E\left[\log_q\frac{1}{f(X)}\right] = \frac{1}{q-1}\,E\left[1-[f(X)]^{q-1}\right]. $$
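As a quick example, if X is standard exponential, with $f(x)=e^{-x}$, then $\int_0^\infty [f(x)]^q\,dx=1/q$, so $H_T(X)=\frac{1}{q-1}\left(1-\frac{1}{q}\right)=\frac{1}{q}$, which tends to the Shannon entropy $H_S(X)=1$ as $q\to1$.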
Tsallis entropy has applications in many fields, from statistical mechanics and thermodynamics, to image processing and reliability, sometimes being more suited to measuring uncertainty than classical Shannon entropy [33,34].
In [35], Tsallis entropy was computed and its properties obtained for record values and their concomitants when the bivariate distribution is in the FGM family.
Similarly to the Shannon case, we can consider residual and past variants of the Tsallis entropy in the context of reliability. In [36], Nanda and Paul introduced the residual Tsallis entropy as the 'first kind residual entropy of order β'; in our notation, β is q:
$$ H_T(X;t) = \frac{1}{q-1}\left(1-\int_t^{\infty}\left(\frac{f(x)}{\bar F(t)}\right)^q dx\right) = \frac{1}{q-1}\left(1-\frac{1}{[\bar F(t)]^q}\int_t^{\infty}[f(x)]^q\,dx\right). $$
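For instance, for a standard exponential lifetime, $\int_t^\infty [f(x)]^q\,dx=e^{-qt}/q$ and $[\bar F(t)]^q=e^{-qt}$, so $H_T(X;t)=1/q$ for every $t$, the Tsallis analogue of the constant residual Shannon entropy of the exponential distribution.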
In addition to entropy-type information measures, there are two other types of information measures that can be associated with probability distributions: Fisher-type measures and divergence measures [37].

3.3. Fisher Information Number

Fisher information measures the amount of information that a sample provides about an unknown parameter and, therefore, it measures the uncertainty contained in an unknown characteristic of a population. If the parameter is a location parameter, then the Fisher information is shift-invariant and has the form:
$$ I_f = \int_{-\infty}^{+\infty}\left(\frac{\partial}{\partial x}\log f(x)\right)^2 f(x)\,dx = E\left[\left(\frac{\partial}{\partial x}\log f(x)\right)^2\right]. $$
Shift-invariant Fisher information, also called Fisher Information Number (FIN), was studied in [38]. It has applications in statistical physics where it is also known by the name extreme physical information [39], and it is used in analyzing the evolution of dynamical systems.
For the concomitants of GOS from the FGM family, the Fisher information number was determined in [40].

3.4. Divergence Measures

Divergences are useful tools when a measure of the difference between two probability distributions is needed; therefore, they have applications in various fields, from inference for Markov chains [41,42,43] to machine learning [44,45]. One of the best-known divergences is the Kullback–Leibler divergence [46,47], which, for two continuous random variables $Z_1$, with probability density $f_1$, and $Z_2$, with probability density $f_2$, is:
$$ KLD(Z_1,Z_2) = \int_{-\infty}^{+\infty} f_1(z)\log\frac{f_1(z)}{f_2(z)}\,dz. $$
Kullback–Leibler divergence for the concomitants of GOS from the FGM family was computed in [24] and the result is distribution-free.
One of the generalizations of the Kullback–Leibler divergence is the Tsallis divergence, which extends the Kullback–Leibler divergence in the same way that Tsallis entropy extends Shannon entropy. There is a very rich literature on the Tsallis divergence, or Tsallis relative entropy, in the case of discrete distributions; see, for example, [48,49,50]. The Tsallis divergence for continuous distributions appears less frequently in the literature, being studied mainly in the machine learning context [44,45]. The Tsallis divergence for the concomitants will be determined in the next section.

4. Information Measures for the Concomitants from FGM Family, New Results

In this section, we provide some generalizations of the existing results on the information measures for the concomitants of GOS from the FGM family mentioned in the previous section. We are interested in Awad-type extensions of the entropies, in the residual and past Tsallis entropies, in a Tsallis-type extension of the FIN, and in the Tsallis divergence.

4.1. Shannon and Shannon-Related Entropies

One can easily notice that the relationship between the Shannon–Awad entropy (18) and the Shannon entropy (12) is:
$$ H_S^A(X) = H_S(X) + \log\delta. $$
In the following, we provide natural extensions of the results obtained in [24], considering the Shannon–Awad entropy instead of the Shannon entropy. Thus, the Shannon–Awad entropy of the concomitant of the r-th GOS from the FGM family is:
$$ H_S^A(Y_{[r]}^*) = W(r,\alpha,n,m,k) + \big(H_S^A(Y)-\log\delta\big)(1-\alpha C_r^*) - 2\alpha C_r^*\,\phi_f(u) + \log\delta_{[r]}, $$
where $W(r,\alpha,n,m,k)$ and $\phi_f$ are given by (14) and (15), and
$$ \delta = \sup\{f_Y(x)\,|\,x>0\},\qquad \delta_{[r]} = \sup\{g_{[r]}(x)\,|\,x>0\}. $$
Remark 5. 
For simple OS, the Shannon–Awad entropy of the concomitant of the r-th order statistic from the FGM family is:
$$ H_S^A(Y_{[r]}) = W(r,\alpha,n,0,1) + \big(H_S^A(Y)-\log\delta\big)\left(1+\alpha\frac{n-2r+1}{n+1}\right) + 2\alpha\frac{n-2r+1}{n+1}\,\phi_f(u) + \log\delta_{[r]}, $$
with $\delta_{[r]} = \sup\{g_{[r]}(x)\,|\,x>0\}$, $g_{[r]}$ being here the pdf of the concomitant of the r-th order statistic.
For record values, the Shannon–Awad entropy of the concomitant of the r-th record value from the FGM family is:
$$ H_S^A(R_{[r]}) = W(r,\alpha,n,-1,1) + \big(H_S^A(Y)-\log\delta\big)\left(1+\alpha(2^{1-r}-1)\right) + 2\alpha(2^{1-r}-1)\,\phi_f(u) + \log\delta_{[r]}, $$
with $\delta_{[r]} = \sup\{g_{[r]}(x)\,|\,x>0\}$, $g_{[r]}$ being here the pdf of the concomitant of the r-th record value.
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the Shannon–Awad entropy of the concomitant of the r-th order statistic from the FGM family is (35) with $m=R$, the removal number.
An extension of the above entropies is given by the residual and past Shannon–Awad entropies. We define the residual Shannon–Awad entropy as:
$$ H_S^A(X;t) = -E\left[\log\frac{f(X)}{\bar F(t)\,\delta_{(t,\infty)}^{\bar F}}\,\Big|\,X>t\right], $$
where
$$ \delta_{(t,\infty)}^{\bar F} = \frac{1}{\bar F(t)}\sup\{f(x)\,|\,x\in(t,\infty)\}. $$
In terms of the failure rate, the residual Shannon–Awad entropy can be written as:
$$ H_S^A(X;t) = 1-E\left[\log\lambda(X)\,|\,X>t\right] + \log\delta_{(t,\infty)}^{\bar F}. $$
We notice that the relationship between the residual Shannon entropy and the residual Shannon–Awad entropy is similar to (34), namely:
$$ H_S^A(X;t) = H_S(X;t) + \log\delta_{(t,\infty)}^{\bar F}. $$
In a similar way, we can extend the past Shannon entropy to the past Shannon–Awad entropy:
$$ \bar H_S^A(X;t) = -E\left[\log\frac{f(X)}{F(t)\,\delta_{(0,t)}^{F}}\,\Big|\,X<t\right], $$
where
$$ \delta_{(0,t)}^{F} = \frac{1}{F(t)}\sup\{f(x)\,|\,x\in(0,t)\}. $$
As a function of the reversed failure rate, the past Shannon–Awad entropy can be written as:
$$ \bar H_S^A(X;t) = 1-E\left[\log\tau(X)\,|\,X<t\right] + \log\delta_{(0,t)}^{F}. $$
We can also write the relationship between the past Shannon–Awad entropy and the past Shannon entropy:
$$ \bar H_S^A(X;t) = \bar H_S(X;t) + \log\delta_{(0,t)}^{F}. $$
Taking into account the above relationships, together with (21) and (26), we can obtain the Awad-type extensions of the Shannon entropy for the concomitant of the r-th GOS from the FGM family, when the concomitant represents the residual life or the past life of a unit.
Theorem 1. 
The residual Shannon–Awad entropy for the concomitant of the r-th GOS from the FGM family is:
$$ H_S^A(Y_{[r]}^*;t) = \log\bar G_{[r]}(t) - \frac{1}{\bar G_{[r]}(t)}\Big\{(1-\alpha C_r^*)\Big[\bar F_Y(t)\Big(\log\bar F_Y(t) - H_S^A(Y;t) + \log\delta_{(t,\infty)}^{\bar F_Y}\Big)\Big] + 2\alpha C_r^*\,\phi_f(t) + K_1(r,t,\alpha,n,m,k)\Big\} + \log\delta_{(t,\infty)}^{\bar G_{[r]}}, $$
where $K_1(r,t,\alpha,n,m,k)$ and $\phi_f(t)$ are given by (22) and (23), respectively, and
$$ \delta_{(t,\infty)}^{\bar F_Y} = \frac{1}{\bar F_Y(t)}\sup\{f_Y(x)\,|\,x\in(t,\infty)\},\qquad \delta_{(t,\infty)}^{\bar G_{[r]}} = \frac{1}{\bar G_{[r]}(t)}\sup\{g_{[r]}(x)\,|\,x\in(t,\infty)\}. $$
The past Shannon–Awad entropy for the concomitant of the r-th GOS from the FGM family is:
$$ \bar H_S^A(Y_{[r]}^*;t) = \log G_{[r]}(t) - \frac{1}{G_{[r]}(t)}\Big\{(1-\alpha C_r^*)\Big[F_Y(t)\Big(\log F_Y(t) - \bar H_S^A(Y;t) + \log\delta_{(0,t)}^{F_Y}\Big)\Big] + 2\alpha C_r^*\,\bar\phi_f(t) + K_2(r,t,\alpha,n,m,k)\Big\} + \log\delta_{(0,t)}^{G_{[r]}}, $$
where $K_2(r,t,\alpha,n,m,k)$ and $\bar\phi_f(t)$ are given by (27) and (28), respectively, and
$$ \delta_{(0,t)}^{F_Y} = \frac{1}{F_Y(t)}\sup\{f_Y(x)\,|\,x\in(0,t)\},\qquad \delta_{(0,t)}^{G_{[r]}} = \frac{1}{G_{[r]}(t)}\sup\{g_{[r]}(x)\,|\,x\in(0,t)\}. $$
Corollary 1. 
The residual Shannon–Awad entropy for the concomitant of the r-th order statistic is:
$$ H_S^A(Y_{[r]};t) = \log\bar G_{[r]}(t) - \frac{1}{\bar G_{[r]}(t)}\Big\{\Big(1+\alpha\frac{n-2r+1}{n+1}\Big)\Big[\bar F_Y(t)\Big(\log\bar F_Y(t)-H_S^A(Y;t)+\log\delta_{(t,\infty)}^{\bar F_Y}\Big)\Big] - 2\alpha\frac{n-2r+1}{n+1}\,\phi_f(t) + K_1(r,t,\alpha,n,0,1)\Big\} + \log\delta_{(t,\infty)}^{\bar G_{[r]}}. $$
The residual Shannon–Awad entropy for the concomitant of the r-th record value is:
$$ H_S^A(R_{[r]};t) = \log\bar G_{[r]}(t) - \frac{1}{\bar G_{[r]}(t)}\Big\{\big(1+\alpha(2^{1-r}-1)\big)\Big[\bar F_Y(t)\Big(\log\bar F_Y(t)-H_S^A(Y;t)+\log\delta_{(t,\infty)}^{\bar F_Y}\Big)\Big] - 2\alpha(2^{1-r}-1)\,\phi_f(t) + K_1(r,t,\alpha,n,-1,1)\Big\} + \log\delta_{(t,\infty)}^{\bar G_{[r]}}. $$
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the residual Shannon–Awad entropy of the concomitant of the r-th order statistic from the FGM family is (46) with $m=R$, the removal number.
Similar results can also be obtained for the past Shannon–Awad entropy.

4.2. Tsallis and Tsallis-Related Entropies

Information measures related to Tsallis entropy for concomitants are scarce in the literature. In [35], the Tsallis entropy and the residual Tsallis entropy for the concomitants of record values from the FGM family are obtained. In this subsection, we obtain more general results, computing Tsallis entropies for the concomitants of generalized order statistics and, furthermore, considering Awad-type extensions of the Tsallis entropies.
Theorem 2. 
The Tsallis entropy for the concomitant of the r-th GOS from the FGM family is:
$$ H_T(Y_{[r]}^*) = \frac{1}{q-1}\left(1-\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\right]\right), $$
where U is a U(0,1) random variable and $E_U$ denotes the expectation of $\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}$ with respect to U.
Proof. 
Taking into account the definition of Tsallis entropy (29) and the density of the concomitants (7), we obtain:
$$ H_T(Y_{[r]}^*) = \frac{1}{q-1}\left(1-\int_0^\infty [g_{[r]}(y)]^q\,dy\right) = \frac{1}{q-1}\left(1-\int_0^\infty [f_Y(y)]^q\big(1+\alpha C_r^*(2F_Y(y)-1)\big)^q\,dy\right). $$
We have that:
$$ \big(1+\alpha C_r^*(2F_Y(y)-1)\big)^q = \sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,[F_Y(y)]^{k-s}. $$
Additionally, if we consider the transformation
$$ F_Y(y)=u,\qquad y=F_Y^{-1}(u),\qquad f_Y(y)\,dy=du, $$
the result (50) follows. □
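For integer q, the double sum in Theorem 2 can be checked numerically. Assuming for illustration that Y is uniform on (0,1), we have $f_Y(F_Y^{-1}(u))=1$ and $E_U[U^{k-s}]=1/(k-s+1)$, so the sum must agree with the direct integral of $[g_{[r]}(y)]^q$. A minimal Python sketch, with $a=\alpha C_r^*$ an arbitrary value:

from math import comb
from scipy.integrate import quad

def tsallis_sum(a, q):
    total = 0.0
    for k in range(q + 1):
        for s in range(k + 1):
            total += (-1) ** s * comb(q, k) * comb(k, s) * 2 ** (k - s) * a ** k / (k - s + 1)
    return (1.0 - total) / (q - 1.0)

def tsallis_direct(a, q):
    val, _ = quad(lambda u: (1.0 + a * (2.0 * u - 1.0)) ** q, 0.0, 1.0)
    return (1.0 - val) / (q - 1.0)

for a, q in ((0.3, 2), (-0.5, 3), (0.7, 4)):
    print(a, q, tsallis_sum(a, q), tsallis_direct(a, q))   # the two should match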
Corollary 2. 
The Tsallis entropy for the concomitant of the r-th order statistic is:
$$ H_T(Y_{[r]}) = \frac{1}{q-1}\left\{1-\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(\frac{n-2r+1}{n+1}\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\right]\right\}. $$
The Tsallis entropy for the concomitant of the r-th record value is:
$$ H_T(R_{[r]}) = \frac{1}{q-1}\left\{1-\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(2^{1-r}-1\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\right]\right\}. $$
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the Tsallis entropy of the concomitant of the r-th order statistic from the FGM family is (50) with $m=R$, the removal number.
We now discuss some Tsallis-related entropies. First, we give an Awad-type extension of Tsallis entropy; then we focus on the residual Tsallis and past Tsallis entropies and their Awad-type extensions.
Several Awad-type extensions have been proposed in the literature ([51,52]). We now introduce this type of extension for Tsallis entropy, defining the Tsallis–Awad entropy of a continuous random variable X taking values in $\mathbb{R}$:
$$ H_T^A(X) = \frac{1}{q-1}\left(1-\int_{-\infty}^{+\infty}\left(\frac{f(x)}{\delta}\right)^{q-1}f(x)\,dx\right) = \frac{1}{q-1}\left(1-\frac{1}{\delta^{q-1}}\int_{-\infty}^{+\infty}[f(x)]^q\,dx\right), $$
where $\delta=\sup\{f(x)\,|\,x\in\mathbb{R}\}$.
We notice that the relationship between the Tsallis–Awad entropy and the Tsallis entropy is:
$$ H_T^A(X) = \delta^{1-q}\,H_T(X) + \log_q\delta. $$
Indeed, from (29) we have $\int_{-\infty}^{+\infty}[f(x)]^q\,dx = 1-(q-1)H_T(X)$; substituting this into (53) and using $\log_q\delta = \frac{1-\delta^{1-q}}{q-1}$ gives (54).
Using (50) and (54), we can obtain the expression of Tsallis–Awad entropy for the concomitant of the r-th GOS from the FGM family:
Theorem 3. 
The Tsallis–Awad entropy for the concomitant of the r-th GOS from the FGM family is:
$$ H_T^A(Y_{[r]}^*) = \frac{\delta^{1-q}}{q-1}\left(1-\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\right]\right) + \log_q\delta, $$
where U is a U(0,1) random variable and, in this case, $\delta=\sup\{g_{[r]}(x)\,|\,x>0\}$.
Corollary 3. 
The Tsallis–Awad entropy for the concomitant of the r-th order statistic is:
$$ H_T^A(Y_{[r]}) = \frac{\delta^{1-q}}{q-1}\left\{1-\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(\frac{n-2r+1}{n+1}\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\right]\right\} + \log_q\delta. $$
The Tsallis–Awad entropy for the concomitant of the r-th record value is:
$$ H_T^A(R_{[r]}) = \frac{\delta^{1-q}}{q-1}\left\{1-\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(2^{1-r}-1\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\right]\right\} + \log_q\delta. $$
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the Tsallis–Awad entropy of the concomitant of the r-th order statistic from the FGM family is (55) with $m=R$, the removal number.
In a similar way to the definition of the residual Tsallis entropy (31), we can consider the past Tsallis entropy:
$$ \bar H_T(X;t) = \frac{1}{q-1}\left(1-\int_0^t\left(\frac{f(x)}{F(t)}\right)^q dx\right) = \frac{1}{q-1}\left(1-\frac{1}{[F(t)]^q}\int_0^t[f(x)]^q\,dx\right). $$
Taking into account Theorem 2, the following theorem is naturally deduced:
Theorem 4. 
The residual Tsallis entropy for the concomitant of the r-th GOS from the FGM family is:
$$ H_T(Y_{[r]}^*;t) = \frac{1}{q-1}\left\{1-\frac{1}{[\bar G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U>F_Y(t)\right]\right\}, $$
where U is a U(0,1) random variable and $E_U$ is the conditional expectation of $\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}$, given $U>F_Y(t)$.
The past Tsallis entropy for the concomitant of the r-th GOS from the FGM family is:
$$ \bar H_T(Y_{[r]}^*;t) = \frac{1}{q-1}\left\{1-\frac{1}{[G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U<F_Y(t)\right]\right\}, $$
where U is a U(0,1) random variable and $E_U$ is the conditional expectation of $\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}$, given $U<F_Y(t)$.
Corollary 4. 
The residual Tsallis entropy for the concomitant of the r-th order statistic is:
$$ H_T(Y_{[r]};t) = \frac{1}{q-1}\left\{1-\frac{1}{[\bar G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(\frac{n-2r+1}{n+1}\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U>F_Y(t)\right]\right\}. $$
The residual Tsallis entropy for the concomitant of the r-th record value is:
$$ H_T(R_{[r]};t) = \frac{1}{q-1}\left\{1-\frac{1}{[\bar G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(2^{1-r}-1\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U>F_Y(t)\right]\right\}. $$
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the residual Tsallis entropy of the concomitant of the r-th order statistic from the FGM family is (59) with $m=R$, the removal number.
Similar results can be obtained for the past Tsallis entropy.
We now introduce the residual Tsallis–Awad entropy:
$$ H_T^A(X;t) = \frac{1}{q-1}\left(1-\frac{1}{\big(\delta_{(t,\infty)}^{\bar F}\big)^{q-1}}\int_t^{+\infty}\left(\frac{f(x)}{\bar F(t)}\right)^q dx\right) = \frac{1}{q-1}\left(1-E\left[\left(\frac{f(X)}{\bar F(t)\,\delta_{(t,\infty)}^{\bar F}}\right)^{q-1}\Big|\,X>t\right]\right), $$
where $\delta_{(t,\infty)}^{\bar F} = \frac{1}{\bar F(t)}\sup\{f(x)\,|\,x\in(t,+\infty)\}$.
The past Tsallis–Awad entropy can also be defined:
$$ \bar H_T^A(X;t) = \frac{1}{q-1}\left(1-\frac{1}{\big(\delta_{(0,t)}^{F}\big)^{q-1}}\int_0^{t}\left(\frac{f(x)}{F(t)}\right)^q dx\right) = \frac{1}{q-1}\left(1-E\left[\left(\frac{f(X)}{F(t)\,\delta_{(0,t)}^{F}}\right)^{q-1}\Big|\,0<X<t\right]\right), $$
where $\delta_{(0,t)}^{F} = \frac{1}{F(t)}\sup\{f(x)\,|\,x\in(0,t)\}$.
We notice that relationships similar to (54) can be written for the residual and for the past Tsallis entropies:
$$ H_T^A(X;t) = \big(\delta_{(t,\infty)}^{\bar F}\big)^{1-q}H_T(X;t) + \log_q\delta_{(t,\infty)}^{\bar F}, $$
$$ \bar H_T^A(X;t) = \big(\delta_{(0,t)}^{F}\big)^{1-q}\bar H_T(X;t) + \log_q\delta_{(0,t)}^{F}, $$
and the following theorem can be proven:
Theorem 5. 
The residual Tsallis–Awad entropy for the concomitant of the r-th GOS from the FGM family is:
$$ H_T^A(Y_{[r]}^*;t) = \frac{\big(\delta_{(t,\infty)}^{\bar G_{[r]}}\big)^{1-q}}{q-1}\left\{1-\frac{1}{[\bar G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U>F_Y(t)\right]\right\} + \log_q\delta_{(t,\infty)}^{\bar G_{[r]}}, $$
where U is a U(0,1) random variable and $\delta_{(t,\infty)}^{\bar G_{[r]}} = \frac{1}{\bar G_{[r]}(t)}\sup\{g_{[r]}(x)\,|\,x\in(t,+\infty)\}$.
The past Tsallis–Awad entropy for the concomitant of the r-th GOS from the FGM family is:
$$ \bar H_T^A(Y_{[r]}^*;t) = \frac{\big(\delta_{(0,t)}^{G_{[r]}}\big)^{1-q}}{q-1}\left\{1-\frac{1}{[G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k (C_r^*)^k\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U<F_Y(t)\right]\right\} + \log_q\delta_{(0,t)}^{G_{[r]}}, $$
where U is a U(0,1) random variable and $\delta_{(0,t)}^{G_{[r]}} = \frac{1}{G_{[r]}(t)}\sup\{g_{[r]}(x)\,|\,x\in(0,t)\}$.
Corollary 5. 
The residual Tsallis–Awad entropy for the concomitant of the r-th order statistic is:
$$ H_T^A(Y_{[r]};t) = \frac{\big(\delta_{(t,\infty)}^{\bar G_{[r]}}\big)^{1-q}}{q-1}\left\{1-\frac{1}{[\bar G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(\frac{n-2r+1}{n+1}\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U>F_Y(t)\right]\right\} + \log_q\delta_{(t,\infty)}^{\bar G_{[r]}}. $$
The residual Tsallis–Awad entropy for the concomitant of the r-th record value is:
$$ H_T^A(R_{[r]};t) = \frac{\big(\delta_{(t,\infty)}^{\bar G_{[r]}}\big)^{1-q}}{q-1}\left\{1-\frac{1}{[\bar G_{[r]}(t)]^q}\sum_{k=0}^{q}\sum_{s=0}^{k}(-1)^s\binom{q}{k}\binom{k}{s}2^{k-s}\alpha^k(-1)^k\left(2^{1-r}-1\right)^k E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{q-1}U^{k-s}\,\big|\,U>F_Y(t)\right]\right\} + \log_q\delta_{(t,\infty)}^{\bar G_{[r]}}. $$
In the case of progressive type II censored order statistics with an equi-balanced censoring scheme, the residual Tsallis–Awad entropy of the concomitant of the r-th order statistic from the FGM family is obtained from Theorem 5 with $m=R$, the removal number.
Similar results can be obtained for the past Tsallis–Awad entropy.

4.3. Fisher–Tsallis Information Number

Various generalizations of the FIN have been proposed; see, for example, [53,54,55]. In [53], the FIN is generalized by replacing the expectation and the logarithm functions with their q-variants, and in [54], a $(\beta,q)$-Fisher information is defined. Here, we consider the following extension of the FIN, which we call the Fisher–Tsallis information number:
$$ I_f = E_f\left[\left(\frac{\partial}{\partial x}\log_q f(x)\right)^2\right], $$
where $\log_q$ is given by (30). This extension is a particular case of the $(\beta,q)$-Fisher information from [54], with $\beta=2$.
For the concomitants of the GOS from the FGM family, we have the following theorem, which can be seen as an extension of the results obtained in [40]:
Theorem 6. 
For the concomitant $Y_{[r]}^*$ of the r-th GOS from the FGM family, the Fisher–Tsallis information number for a location parameter is:
$$ I_{g_{[r]}} = I_1 + I_2 + I_3, $$
where
$$ I_1 = E_U\!\left[\big(f_Y'(F_Y^{-1}(U))\big)^2\big(f_Y(F_Y^{-1}(U))\big)^{-2q}\big(1+\alpha C_r^*(2U-1)\big)^{2-2q}\right], $$
$$ I_2 = 4\alpha^2 (C_r^*)^2\,E_U\!\left[\big(f_Y(F_Y^{-1}(U))\big)^{4-2q}\big(1+\alpha C_r^*(2U-1)\big)^{-2q}\right], $$
$$ I_3 = 4\alpha C_r^*\,E_U\!\left[f_Y'(F_Y^{-1}(U))\big(f_Y(F_Y^{-1}(U))\big)^{2-2q}\big(1+\alpha C_r^*(2U-1)\big)^{1-2q}\right], $$
with $U=F_Y(Y)$, where Y has the density $g_{[r]}$.
Proof. 
From (73), we have
$$ I_{g_{[r]}} = E_{g_{[r]}}\left[\big(g_{[r]}(Y)\big)^{-2q}\big(g'_{[r]}(Y)\big)^2\right]. $$
Using the expression (7) for the density $g_{[r]}$, we obtain
$$ I_{g_{[r]}} = E_{g_{[r]}}\left[\big(f_Y(Y)\big)^{-2q}\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)^{-2q}\Big(f_Y'(Y)\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)+2\alpha C_r^*\big(f_Y(Y)\big)^2\Big)^2\right]. $$
Thus,
$$ I_{g_{[r]}} = E_{g_{[r]}}\bigg[\big(f_Y(Y)\big)^{-2q}\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)^{-2q}\Big[\big(f_Y'(Y)\big)^2\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)^2 + 4\alpha^2 (C_r^*)^2\big(f_Y(Y)\big)^4 + 4\alpha C_r^*\,f_Y'(Y)\big(f_Y(Y)\big)^2\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)\Big]\bigg]. $$
After some computations,
$$ I_{g_{[r]}} = E_{g_{[r]}}\left[\big(f_Y'(Y)\big)^2\big(f_Y(Y)\big)^{-2q}\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)^{2-2q}\right] + 4\alpha^2 (C_r^*)^2\,E_{g_{[r]}}\left[\big(f_Y(Y)\big)^{4-2q}\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)^{-2q}\right] + 4\alpha C_r^*\,E_{g_{[r]}}\left[f_Y'(Y)\big(f_Y(Y)\big)^{2-2q}\big(1+\alpha C_r^*(2F_Y(Y)-1)\big)^{1-2q}\right]. $$
After the transformation $U=F_Y(Y)$, we obtain (74). □
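The decomposition $I=I_1+I_2+I_3$ can be verified numerically by writing each term as an integral against the density $g_{[r]}$. The following Python sketch (an illustrative check; the exponential parent and the values of $a=\alpha C_r^*$ and q are our own assumptions) compares the direct quadrature of $\int (g'_{[r]})^2 g_{[r]}^{1-2q}\,dy$ with the sum of the three terms:

import numpy as np
from scipy.integrate import quad

a, q = 0.4, 0.75                       # a = alpha * C_r^*, both values arbitrary

f  = lambda y: np.exp(-y)              # f_Y for a standard exponential Y
fp = lambda y: -np.exp(-y)             # f_Y'
F  = lambda y: 1.0 - np.exp(-y)        # F_Y
h  = lambda y: 1.0 + a * (2.0 * F(y) - 1.0)
g  = lambda y: f(y) * h(y)             # concomitant density (7)
gp = lambda y: fp(y) * h(y) + 2.0 * a * f(y) ** 2   # g'

direct, _ = quad(lambda y: gp(y) ** 2 * g(y) ** (1 - 2 * q), 0, 50)

I1, _ = quad(lambda y: fp(y) ** 2 * f(y) ** (-2 * q) * h(y) ** (2 - 2 * q) * g(y), 0, 50)
I2, _ = quad(lambda y: 4 * a ** 2 * f(y) ** (4 - 2 * q) * h(y) ** (-2 * q) * g(y), 0, 50)
I3, _ = quad(lambda y: 4 * a * fp(y) * f(y) ** (2 - 2 * q) * h(y) ** (1 - 2 * q) * g(y), 0, 50)

print(direct, I1 + I2 + I3)            # the two values should agree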

4.4. Tsallis Divergence

We consider the Tsallis divergence of two densities, $f_1$ and $f_2$, as defined in [44]:
$$ TD(Z_1,Z_2) = -\int_{-\infty}^{+\infty} f_1(z)\,\log_q\frac{f_2(z)}{f_1(z)}\,dz, $$
which can also be expressed as:
$$ TD(Z_1,Z_2) = \frac{1}{q-1}\left(E_{f_1}\left[\left(\frac{f_1(Z_1)}{f_2(Z_1)}\right)^{q-1}\right]-1\right). $$
We notice that this is the divergence considered in [36], with $\phi(x)=\log_q(1/x)$; the divergence analyzed in [56], with $k=1-q$; and the Tsallis relative entropy from [57].
When $q\to1$, Tsallis entropy becomes Shannon entropy, and the Tsallis divergence becomes the Kullback–Leibler divergence (33). The next theorem generalizes the results from [23], computing the Tsallis divergence for two concomitants of GOS from the FGM family.
Theorem 7. 
Let $Y_{[r]}$ and $Y_{[s]}$ be the r-th and the s-th concomitants of the GOS from the FGM family, with densities $g_{[r]}$ and $g_{[s]}$. Then, the Tsallis divergence of $g_{[s]}$ from $g_{[r]}$ has the following form:
$$ TD(Y_{[r]},Y_{[s]}) = \frac{1}{q-1}\left[D_1 - D_2 - 1\right], $$
where
$$ D_1 = \frac{1}{2\alpha C_r^*(q+1)}\left(\frac{C_r^*}{C_r^*-C_s^*}\right)^{q-1}\big(1+\alpha C_r^*\big)^{q+1}\,{}_2F_1\!\left(q-1,\,q+1;\,q+2;\,\frac{C_s^*(1+\alpha C_r^*)}{C_s^*-C_r^*}\right), $$
$$ D_2 = \frac{1}{2\alpha C_r^*(q+1)}\left(\frac{C_r^*}{C_r^*-C_s^*}\right)^{q-1}\big(1-\alpha C_r^*\big)^{q+1}\,{}_2F_1\!\left(q-1,\,q+1;\,q+2;\,\frac{C_s^*(1-\alpha C_r^*)}{C_s^*-C_r^*}\right), $$
with ${}_2F_1$ being the Gauss hypergeometric function.
Proof. 
In (76), we replace $f_1$ and $f_2$ with the concomitant densities:
$$ g_{[r]}(y) = f_Y(y)\big[1+\alpha C_r^*(2F_Y(y)-1)\big], $$
$$ g_{[s]}(y) = f_Y(y)\big[1+\alpha C_s^*(2F_Y(y)-1)\big], $$
and we compute the expectation:
$$ E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = \int_0^\infty g_{[r]}(y)\left(\frac{g_{[r]}(y)}{g_{[s]}(y)}\right)^{q-1}dy = \int_0^\infty f_Y(y)\,\frac{\big[1+\alpha C_r^*(2F_Y(y)-1)\big]^{q}}{\big[1+\alpha C_s^*(2F_Y(y)-1)\big]^{q-1}}\,dy. $$
First, we make the transformation
$$ F_Y(y)=u,\qquad y=F_Y^{-1}(u),\qquad f_Y(y)\,dy=du, $$
and we obtain
$$ E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = \int_0^1 \frac{\big[1+\alpha C_r^*(2u-1)\big]^{q}}{\big[1+\alpha C_s^*(2u-1)\big]^{q-1}}\,du. $$
Then, we make the transformation
$$ 2u-1=v,\qquad u=\frac{v+1}{2},\qquad 2\,du=dv, $$
and we obtain:
$$ E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = \frac{1}{2}\int_{-1}^{1}\frac{\big(1+\alpha C_r^* v\big)^{q}}{\big(1+\alpha C_s^* v\big)^{q-1}}\,dv. $$
We use the general formula:
$$ \frac{1}{2}\int_{-1}^{1}(1+ax)^A(1+bx)^B\,dx = \frac{1}{2a(A+1)}\left[(1+a)^{A+1}(1+b)^B\left(\frac{a(1+b)}{a-b}\right)^{-B}{}_2F_1\!\left(A+1,\,-B;\,A+2;\,\frac{b(1+a)}{b-a}\right) - (1-a)^{A+1}(1-b)^B\left(\frac{a(1-b)}{a-b}\right)^{-B}{}_2F_1\!\left(A+1,\,-B;\,A+2;\,\frac{b(1-a)}{b-a}\right)\right], $$
where ${}_2F_1$ is the hypergeometric function. Applying it with $A=q$, $B=1-q$, $a=\alpha C_r^*$ and $b=\alpha C_s^*$, it results in:
$$ E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = D_1 - D_2.\ \square $$
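The closed form of Theorem 7 can be checked against direct numerical integration. The following Python sketch (an illustrative check; n, r, s, α and q are arbitrary choices, and $C^*$ is taken from the order-statistics case of Remark 2) evaluates $D_1-D_2$ with SciPy's Gauss hypergeometric function and compares it with the integral form of the expectation:

import numpy as np
from scipy.integrate import quad
from scipy.special import hyp2f1

def D(sign, Cr, Cs, alpha, q):
    a = alpha * Cr
    pref = (1 + sign * a) ** (q + 1) / (2 * a * (q + 1))
    pref *= (Cr / (Cr - Cs)) ** (q - 1)
    z = Cs * (1 + sign * a) / (Cs - Cr)
    return pref * hyp2f1(q - 1, q + 1, q + 2, z)

n, r, s, alpha, q = 6, 1, 4, 0.8, 1.7
Cr = -(n - 2 * r + 1) / (n + 1)        # C*_r for order statistics
Cs = -(n - 2 * s + 1) / (n + 1)        # C*_s for order statistics

closed = D(+1, Cr, Cs, alpha, q) - D(-1, Cr, Cs, alpha, q)
direct, _ = quad(lambda v: 0.5 * (1 + alpha * Cr * v) ** q
                 * (1 + alpha * Cs * v) ** (1 - q), -1, 1)
print(closed, direct)                  # the two values should agree
print((closed - 1) / (q - 1))          # the Tsallis divergence TD(Y_[r], Y_[s])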

5. Conclusions

This paper focused on information measures related to Shannon entropy, Tsallis entropy, Fisher information, and divergences for the concomitants of GOS from the FGM family. We reviewed the literature on these information measures and generalized existing results. The study of concomitants, the variables paired with the order statistics when a sample from a bivariate distribution is ordered by one variate, could have applications in reliability, for example, in the analysis of the lifetime uncertainty of complex systems. For this reason, we also discussed the residual and past versions of the entropies. Considering generalized order statistics (GOS) increases the complexity of the computations, but it yields a general form of the computed measures that can be applied to the concomitants of various types of order statistics.

Author Contributions

All authors contributed equally to the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors are grateful to the referees for their comments and suggestions which improved the presentation of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. David, H.A. Concomitants of order statistics. Bull. Inst. Int. Statist. 1973, 45, 295–300.
  2. Bhattacharya, P. Convergence of sample paths of normalized sums of induced order statistics. Ann. Stat. 1974, 2, 1034–1039.
  3. Bayramoglu, I. Reliability and mean residual life of complex systems with two dependent components per element. IEEE Trans. Reliab. 2013, 62, 276–285.
  4. Ozkut, M. The (n − k + 1)-out-of-n concomitant system having m subcomponents and its reliability. J. Comput. Appl. Math. 2021, 386, 113251.
  5. Scaria, J.; Mohan, S. Dependence concepts and Reliability application of concomitants of order statistics from the Morgenstern family. J. Stat. Theory Appl. 2021, 20, 193–203.
  6. Kamps, U. A concept of generalized order statistics. J. Stat. Plan. Inference 1995, 48, 1–23.
  7. Balakrishnan, N.; Lai, C.D. Continuous Bivariate Distributions; Springer Science + Business Media: Berlin, Germany, 2009.
  8. Kotz, S.; Johnson, N.L. Propriétés de dépendance des distributions itérées généralisées à deux variables Farlie-Gumbel-Morgenstern. C. R. Acad. Sci. Paris A 1977, 285, 277–280.
  9. Huang, J.; Kotz, S. Correlation structure in iterated Farlie-Gumbel-Morgenstern distributions. Biometrika 1984, 71, 633–636.
  10. Huang, J.S.; Kotz, S. Modifications of the Farlie-Gumbel-Morgenstern distributions. A tough hill to climb. Metrika 1999, 49, 135–145.
  11. Bairamov, I.; Kotz, S.; Bekci, M. New generalized Farlie-Gumbel-Morgenstern distributions and concomitants of order statistics. J. Appl. Stat. 2001, 28, 521–536.
  12. Bairamov, I.; Kotz, S. Dependence structure and symmetry of Huang-Kotz FGM distributions and their extensions. Metrika 2002, 56, 55–72.
  13. Aggarwala, R.; Balakrishnan, N. Recurrence relations for single and product moments of progressive Type-II right censored order statistics from exponential and truncated exponential distributions. Ann. Inst. Stat. Math. 1996, 48, 757–771.
  14. Balakrishnan, N.; Cramer, E. The Art of Progressive Censoring; Statistics for Industry and Technology; Birkhäuser: Basel, Switzerland; Springer Science + Business Media: New York, NY, USA, 2014.
  15. Cramer, E.; Kamps, U. Marginal distributions of sequential and generalized order statistics. Metrika 2003, 58, 293–310.
  16. Farlie, D.J.G. The performance of some correlation coefficients for a general bivariate distribution. Biometrika 1960, 47, 307–323.
  17. Gumbel, E. Bivariate exponential distributions. J. Am. Statist. Assoc. 1960, 55, 698–707.
  18. Morgenstern, D. Einfache Beispiele Zweidimensionaler Verteilungen. Mitt. Math. Stat. 1956, 8, 234–235.
  19. Johnson, N.L.; Kotz, S. On some generalized Farlie-Gumbel-Morgenstern distributions-II regression, correlation and further generalizations. Commun. Stat.-Theory Methods 1977, 6, 485–496.
  20. Nelsen, R.B. An Introduction to Copulas; Springer Science + Business Media: New York, NY, USA, 2007.
  21. Beg, M.; Ahsanullah, M. Concomitants of generalized order statistics from Farlie–Gumbel–Morgenstern distributions. Stat. Methodol. 2008, 5, 1–20.
  22. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  23. Tahmasebi, S.; Behboodian, J. Information properties for concomitants of order statistics in Farlie–Gumbel–Morgenstern (FGM) family. Commun. Stat.-Theory Methods 2012, 41, 1954–1968.
  24. Tahmasebi, S.; Behboodian, J. Shannon information for concomitants of generalized order statistics in Farlie–Gumbel–Morgenstern (FGM) family. Bull. Malays. Math. Sci. Soc. 2012, 35, 975–981.
  25. Awad, A. A statistical information measure. Dirasat (Sci.) 1987, 14, 7–20.
  26. Ebrahimi, N.; Pellerey, F. New partial ordering of survival functions based on the notion of uncertainty. J. Appl. Probab. 1995, 32, 202–211.
  27. Ebrahimi, N. How to measure uncertainty in the residual life time distribution. Sankhyā Indian J. Stat. Ser. A 1996, 58, 48–56.
  28. Di Crescenzo, A.; Longobardi, M. Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Probab. 2002, 39, 434–440.
  29. Mohie EL-Din, M.M.; Amein, M.M.; Ali, N.S.A.; Mohamed, M.S. Residual and past entropy for concomitants of ordered random variables of Morgenstern family. J. Probab. Stat. 2015, 2015, 1–6.
  30. Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural a-entropy. Kybernetika 1967, 3, 30–35.
  31. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
  32. Hirica, I.E.; Pripoae, C.L.; Pripoae, G.T.; Preda, V. Weighted Relative Group Entropies and Associated Fisher Metrics. Entropy 2022, 24, 120.
  33. Wilk, G.; Włodarczyk, Z. Example of a possible interpretation of Tsallis entropy. Phys. A Stat. Mech. Its Appl. 2008, 387, 4809–4813.
  34. Calì, C.; Longobardi, M.; Ahmadi, J. Some properties of cumulative Tsallis entropy. Phys. A Stat. Mech. Its Appl. 2017, 486, 1012–1021.
  35. Paul, J.; Thomas, P.Y. Tsallis entropy properties of record values. Calcutta Stat. Assoc. Bull. 2015, 67, 47–60.
  36. Nanda, A.K.; Paul, P. Some results on generalized residual entropy. Inf. Sci. 2006, 176, 27–47.
  37. Zografos, K. On reconsidering entropies and divergences and their cumulative counterparts: Csiszár’s, DPD’s and Fisher’s type cumulative and survival measures. Probab. Eng. Infor. Sci. 2022, 1–28.
  38. Papaioannou, T.; Ferentinos, K. On two forms of Fisher’s measure of information. Commun. Stat.-Theory Methods 2005, 34, 1461–1470.
  39. Frieden, B.R. Physics from Fisher information: A Unification; Cambridge University Press: New York, NY, USA, 1998.
  40. Tahmasebi, S.; Jafari, A.A. Fisher information number for concomitants of generalized order statistics in Morgenstern family. J. Inform. Math. Sci. 2013, 5, 15–20.
  41. Barbu, V.S.; Karagrigoriou, A.; Preda, V. Entropy and divergence rates for Markov chains: I. The Alpha-Gamma and Beta-Gamma case. Proc. Rom. Acad.-Ser. A 2017, 4, 293–301.
  42. Barbu, V.S.; Karagrigoriou, A.; Preda, V. Entropy and divergence rates for Markov chains: II. The weighted case. Proc. Rom. Acad.-Ser. A 2018, 19, 3–10.
  43. Barbu, V.S.; Karagrigoriou, A.; Preda, V. Entropy and divergence rates for Markov chains: III. The Cressie and Read case and applications. Proc. Rom. Acad.-Ser. A 2018, 2, 413–421.
  44. Villmann, T.; Haase, S. Mathematical Aspects of Divergence Based Vector Quantization Using Fréchet-Derivatives. Master’s Thesis, University of Applied Sciences Mittweida, Mittweida, Germany, 2010.
  45. Póczos, B.; Schneider, J. On the estimation of alpha-divergences. In Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 11–13 April 2011; pp. 609–617.
  46. Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
  47. Kullback, S. Information Theory and Statistics; Dover Publications, Inc.: Mineola, NY, USA, 1959.
  48. Furuichi, S.; Yanagi, K.; Kuriyama, K. Fundamental properties of Tsallis relative entropy. J. Math. Phys. 2004, 45, 4868–4877.
  49. Furuichi, S.; Mitroi, F.C. Mathematical inequalities for some divergences. Phys. A Stat. Mech. Its Appl. 2012, 391, 388–400.
  50. Vigelis, R.F.; De Andrade, L.H.; Cavalcante, C.C. Properties of a generalized divergence related to Tsallis generalized divergence. IEEE Trans. Inf. Theory 2019, 66, 2891–2897.
  51. Awad, A.; Alawneh, A. Application of entropy to a life-time model. IMA J. Math. Control Inf. 1987, 4, 143–148.
  52. Sfetcu, R.C.; Sfetcu, S.C.; Preda, V. Ordering Awad–Varma Entropy and Applications to Some Stochastic Models. Mathematics 2021, 9, 280.
  53. Furuichi, S. On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics. J. Math. Phys. 2009, 50, 013303.
  54. Bercher, J.F. Some properties of generalized Fisher information in the context of nonextensive thermostatistics. Phys. A Stat. Mech. Its Appl. 2013, 392, 3140–3154.
  55. Kharazmi, O.; Balakrishnan, N.; Jamali, H. Cumulative Residual q-Fisher Information and Jensen-Cumulative Residual χ2 Divergence Measures. Entropy 2022, 24, 341.
  56. Sfetcu, R.C.; Sfetcu, S.C.; Preda, V. On Tsallis and Kaniadakis Divergences. Math. Phys. Anal. Geom. 2022, 25, 1–23.
  57. Koukoumis, C.; Karagrigoriou, A. On entropy-type measures and divergences with applications in engineering, management and applied sciences. Int. J. Math. Eng. Manag. Sci. 2021, 6, 688.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
