Article

On Geometric Mean and Cumulative Residual Entropy for Two Random Variables with Lindley Type Distribution

by Marius Giuclea 1,2,* and Costin-Ciprian Popescu 1

1 Department of Applied Mathematics, Bucharest University of Economic Studies, Calea Dorobanţi, 15-17, 010552 Bucharest, Romania
2 Institute of Solid Mechanics, Romanian Academy, 15 Constantin Mille, 010141 Bucharest, Romania
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(9), 1499; https://doi.org/10.3390/math10091499
Submission received: 23 March 2022 / Revised: 24 April 2022 / Accepted: 28 April 2022 / Published: 30 April 2022
(This article belongs to the Special Issue Probability, Stochastic Processes and Optimization)

Abstract: In this paper, we focus on two generalizations of the Lindley distribution and investigate, for each one separately, some special properties related to the geometric mean (GM) and the cumulative residual entropy (CRE), both of them being of great importance from the theoretical as well as from the practical point of view.

1. Introduction

One of the most widely used numerical characteristics of a random variable is its mean. If X is a continuous random variable taking strictly positive values and the probability density function of X is f(x), then the geometric mean [1,2] is
$$GM(X) = e^{\int_0^{\infty} \ln x \, f(x)\, dx}.$$
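As a quick numerical illustration (not part of the original paper), the defining integral can be evaluated by quadrature, assuming NumPy and SciPy are available. The exponential distribution is used here because its geometric mean has the known closed form e^{-γ}/λ:

```python
import numpy as np
from scipy.integrate import quad

def geometric_mean(pdf):
    """GM(X) = exp( integral of ln(x) * f(x) over (0, inf) ).

    The integral is split at 1 to isolate the integrable log singularity at 0.
    """
    a, _ = quad(lambda x: np.log(x) * pdf(x), 0.0, 1.0)
    b, _ = quad(lambda x: np.log(x) * pdf(x), 1.0, np.inf)
    return np.exp(a + b)

# Exponential rv with rate lam: GM = exp(-euler_gamma) / lam.
lam = 2.0
gm = geometric_mean(lambda x: lam * np.exp(-lam * x))
print(gm, np.exp(-np.euler_gamma) / lam)  # the two values agree
```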
The concept of geometric mean has various uses [1,3,4,5,6,7] in many fields of science. A detailed approach can be found in [1]. The formulas for the geometric mean of some probability distributions are also provided in [1]. In the present work, one of the topics of discussion is the geometric mean of two continuous random variables that will be specified in the next section.
Another perspective on a random variable is given by information theory. In this framework, a central role is played by the concept of entropy, which is a measure of uncertainty. If X is a discrete random variable with possible values x_i, i = 1, …, n, n ∈ ℕ*, and
$$p_i = P(X = x_i), \quad i \in \{1, \ldots, n\},$$
the Shannon entropy of X [8] is
$$H(X) = -\sum_{i=1}^{n} p_i \log_a p_i.$$
The base of the logarithm can be 2 but, more generally, it can be chosen depending on the application. If the base is equal to the number e, one obtains
$$H(X) = -\sum_{i=1}^{n} p_i \ln p_i.$$
If X is a continuous random variable with probability density function f(x) and D is the set where f(x) is strictly positive, then the differential entropy of X [9] is
$$h(X) = -\int_D f(x) \ln f(x)\, dx.$$
The differential entropy of a continuous random variable has some interesting properties [9], but compared to the Shannon entropy of the discrete case it has certain limitations [10] that must be taken into account. For example, the Shannon entropy is nonnegative, but the differential entropy does not always have this property. To overcome such inconveniences, another measure of uncertainty is proposed in [10], namely the cumulative residual entropy. If X is a non-negative random variable with cumulative distribution function F(x), then the cumulative residual entropy of X is
$$\mathcal{E}(X) = -\int_0^{\infty} \bar{F}(x) \ln \bar{F}(x)\, dx,$$
where
$$\bar{F}(x) = 1 - F(x).$$
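The CRE definition is also straightforward to evaluate numerically (a sketch, not from the paper, assuming NumPy and SciPy). For an exponential random variable with rate λ one can check analytically that −∫ F̄ ln F̄ dx = λ∫ x e^{-λx} dx = 1/λ:

```python
import numpy as np
from scipy.integrate import quad

def cre(sf):
    """Cumulative residual entropy: -integral of Fbar * ln(Fbar) over (0, inf)."""
    def integrand(x):
        s = sf(x)
        # -s*ln(s) -> 0 at both s = 0 and s = 1, so guard the endpoints.
        return -s * np.log(s) if 0.0 < s < 1.0 else 0.0
    val, _ = quad(integrand, 0.0, np.inf)
    return val

lam = 2.0
cre_exp = cre(lambda x: np.exp(-lam * x))
print(cre_exp)  # close to 1/lam = 0.5
```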
In [10], some properties of the cumulative residual entropy are given and its relationship with the differential entropy is established. The usefulness of the CRE in reliability engineering and computer vision is also shown in [10]. In various works, the concept of CRE is a good starting point for obtaining new and interesting results. For instance, in [11], the Bayesian estimator of the dynamic cumulative residual Rényi entropy is discussed. In [12], some properties of the dynamic cumulative residual entropy are studied, and in [13] the CRE is investigated for coherent and mixed systems whose component lifetimes are identically distributed. In [14], the CRE is generalized to fractional order and its properties are given, and in [15] a consistent estimator for the CRE is proposed, whose asymptotic distribution is normal.
The Lindley distribution [16,17] is one of the random variables that is important not only for its direct applications but also for the many theoretical developments that have followed it. For instance, in [17], some of its characteristics such as moments, entropies and so on, are extensively studied. In addition, the Lindley distribution is proposed for modeling the waiting time in a bank [17]. The probability density function of the Lindley distribution is
$$f(x;\theta) : (0,\infty) \to \mathbb{R}, \quad f(x;\theta) = \frac{\theta^2}{\theta+1}\,(1+x)\, e^{-\theta x},$$
with θ > 0.
The cumulative distribution function of the Lindley distribution [17] is
$$F(x;\theta) = 1 - \frac{1+\theta+\theta x}{1+\theta}\, e^{-\theta x}, \quad x > 0.$$
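As a sanity check on these two formulas (an illustration, assuming NumPy and SciPy), the density should integrate to one, and integrating it up to a point should reproduce the cumulative distribution function:

```python
import numpy as np
from scipy.integrate import quad

def lindley_pdf(x, theta):
    """Lindley pdf, theta^2/(theta+1) * (1+x) * exp(-theta*x)."""
    return theta**2 / (theta + 1.0) * (1.0 + x) * np.exp(-theta * x)

def lindley_cdf(x, theta):
    """Lindley cdf, 1 - (1+theta+theta*x)/(1+theta) * exp(-theta*x)."""
    return 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)

theta = 1.5
total, _ = quad(lindley_pdf, 0.0, np.inf, args=(theta,))
part, _ = quad(lindley_pdf, 0.0, 2.0, args=(theta,))
print(total)                            # close to 1: the density normalizes
print(part - lindley_cdf(2.0, theta))   # close to 0: pdf integrates to cdf
```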
Regarding the developments based on the Lindley distribution, it is worth noting the introduction of new random variables [18,19,20,21,22,23,24,25,26,27]. In [18], two new families of distributions with applications to repairable data are considered. A new model, namely the generalized Lindley distribution of integer order, is given in [19], and its application to the study of some medical data is also emphasized. In [20], a new distribution that can be used in insurance is proposed. The distribution model discussed in [21] is suitable for reliability and fatigue life problems. In [22], a three-parameter Lindley distribution is introduced. A five-parameter generalized Lindley distribution is given in [23]; it was used in the study of four data sets, among them a set of medical data and a set of data regarding the strength of glass in a certain environment [23]. A discrete Lindley distribution is given in [24]; it is compared with the geometric and Poisson distributions, and its usefulness in analyzing some data sets, including medical data, is studied. A Lindley distribution of discrete type is given in [25] and is employed in the study of automobile claim data, a situation in which it is compared with the Poisson model. In [26], a distribution called the exponential-modified discrete Lindley distribution is proposed and used in modelling exceedances of flood peaks for a river or the period between earthquakes of a certain magnitude. The three-parameter Lindley distribution given in [22] is considered in [27], where some medical data are modeled. In the present paper, two continuous distributions [22,23] that generalize the Lindley distribution are discussed. Following the results already obtained [22,23], some new relationships regarding these two distributions are given.

2. Preliminaries

This work focuses on two random variables related to the Lindley distribution: a continuous random variable with three parameters [22] and one with five parameters [23]. For each, the geometric mean and the cumulative residual entropy will be determined. There is a relationship between the cumulative residual entropy and the differential entropy [10], but in this paper the formulas for the cumulative residual entropy will be deduced using only its definition. For both random variables, all parameters are considered strictly positive, except for β, which is nonnegative. The three-parameter Lindley distribution X [22] has the probability density function
$$f_X(x;\theta,\alpha,\mu) : (0,\infty) \to \mathbb{R}, \quad f_X(x;\theta,\alpha,\mu) = \frac{\theta^2}{\theta\alpha+\mu}\,(\alpha+\mu x)\, e^{-\theta x}.$$
The corresponding cumulative distribution function is [22]
$$F_X(x;\theta,\alpha,\mu) : \mathbb{R} \to \mathbb{R}, \quad F_X(x;\theta,\alpha,\mu) = \begin{cases} 1 - \left(1 + \dfrac{\theta\mu x}{\theta\alpha+\mu}\right) e^{-\theta x}, & x > 0 \\ 0, & x \le 0. \end{cases}$$
The five-parameter Lindley distribution Y [23] has the probability density function
$$f_Y(y;\delta,\alpha,\eta,\theta,\beta) : (\beta,\infty) \to \mathbb{R}, \quad f_Y(y;\delta,\alpha,\eta,\theta,\beta) = \frac{\theta}{\delta\alpha+\eta}\left[\delta\alpha + \eta\theta(y-\beta)\right] e^{-\theta(y-\beta)}.$$
In this case, the cumulative distribution function is [23]
$$F_Y(y;\delta,\alpha,\eta,\theta,\beta) : \mathbb{R} \to \mathbb{R}, \quad F_Y(y;\delta,\alpha,\eta,\theta,\beta) = \begin{cases} 1 - \left(1 + \dfrac{\theta\eta(y-\beta)}{\delta\alpha+\eta}\right) e^{-\theta(y-\beta)}, & y > \beta \\ 0, & y \le \beta. \end{cases}$$
The three-parameter distribution [22] can be viewed as a sub-model of the five-parameter distribution [23] because the five-parameter distribution reduces to the three-parameter distribution for β = 0 , δ = θ and η = μ [23]. Some details about the relations between the parameters of these two random variables are given in [23].
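The sub-model relation can be verified numerically (an illustrative sketch, assuming NumPy): substituting β = 0, δ = θ, η = μ into the five-parameter density collapses it to the three-parameter one pointwise:

```python
import numpy as np

def pdf3(x, theta, alpha, mu):
    """Three-parameter Lindley pdf [22]."""
    return theta**2 / (theta * alpha + mu) * (alpha + mu * x) * np.exp(-theta * x)

def pdf5(y, delta, alpha, eta, theta, beta):
    """Five-parameter Lindley pdf [23], supported on (beta, inf)."""
    z = y - beta
    return theta / (delta * alpha + eta) * (delta * alpha + eta * theta * z) * np.exp(-theta * z)

# Sub-model check: beta=0, delta=theta, eta=mu collapses pdf5 to pdf3.
theta, alpha, mu = 2.0, 1.0, 3.0
x = np.linspace(0.1, 5.0, 50)
diff = np.max(np.abs(pdf5(x, theta, alpha, mu, theta, 0.0) - pdf3(x, theta, alpha, mu)))
print(diff)  # essentially zero
```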
In the next section of the paper, some notions and results related to mathematical analysis will be used. These are briefly presented below.
The Euler–Mascheroni constant is
$$\gamma = \lim_{n\to\infty}\left(\sum_{k=1}^{n}\frac{1}{k} - \ln n\right) \approx 0.57721,$$
and one of the ways this constant can be written [28] is
$$\gamma = -\int_0^{\infty} e^{-x} \ln x\, dx.$$
If p > 0, the gamma function [29] is defined as
$$\Gamma(p) = \int_0^{\infty} x^{p-1} e^{-x}\, dx.$$
Among the many properties of the gamma function [29] are the following relationships:
$$\Gamma(1) = \int_0^{\infty} e^{-x}\, dx = 1$$
and
$$\Gamma(n) = (n-1)!, \quad \text{for } n \in \mathbb{N},\ n \ge 2.$$
The integral
$$E_1(x) = \int_x^{\infty} \frac{e^{-t}}{t}\, dt$$
is related to the exponential integral [30].
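These three ingredients are all available in the scientific Python stack, which makes the later theorems easy to check numerically. A short verification sketch (assuming NumPy and SciPy; `scipy.special.exp1` implements exactly the E₁ above):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import exp1, gamma

# Euler-Mascheroni constant: gamma = -int_0^inf e^{-x} ln(x) dx,
# split at 1 to isolate the integrable log singularity at 0.
val, _ = quad(lambda x: np.exp(-x) * np.log(x), 0.0, 1.0)
val2, _ = quad(lambda x: np.exp(-x) * np.log(x), 1.0, np.inf)
print(-(val + val2), np.euler_gamma)  # the two values agree

# E1(x) = int_x^inf e^{-t}/t dt, available as scipy.special.exp1.
x0 = 0.7
tail, _ = quad(lambda t: np.exp(-t) / t, x0, np.inf)
print(tail, exp1(x0))  # the two values agree

print(gamma(5))  # Gamma(n) = (n-1)!, so Gamma(5) = 24
```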

3. Results

Theorem 1.
If X is a random variable having the probability density function
$$f_X(x;\theta,\alpha,\mu) : (0,\infty) \to \mathbb{R}, \quad f_X(x;\theta,\alpha,\mu) = \frac{\theta^2}{\theta\alpha+\mu}\,(\alpha+\mu x)\, e^{-\theta x},$$
with θ > 0, α > 0, μ > 0, then
$$GM(X) = \frac{1}{\theta}\, e^{\frac{\mu}{\theta\alpha+\mu} - \gamma},$$
where γ is the Euler–Mascheroni constant.
Proof. 
We have
$$GM(X) = e^{I_1},$$
where
$$I_1 = \int_0^{\infty} \ln x \, f_X(x;\theta,\alpha,\mu)\, dx.$$
Consider the integrals
$$J_1 = \int_0^{\infty} \ln x\, e^{-\theta x}\, dx, \qquad J_2 = \int_0^{\infty} x \ln x\, e^{-\theta x}\, dx.$$
With the substitution t = θx,
$$J_1 = \frac{1}{\theta}\int_0^{\infty} (\ln t - \ln\theta)\, e^{-t}\, dt = \frac{1}{\theta}\left(\int_0^{\infty} \ln t\, e^{-t}\, dt - \ln\theta \int_0^{\infty} e^{-t}\, dt\right) = \frac{-\gamma - \ln\theta}{\theta}.$$
For J₂, split the integral at an arbitrary w ∈ (0,∞), write J₂ = J₂₁ + J₂₂, and integrate each piece by parts; the boundary terms at w cancel, those at 0 and ∞ vanish, and the substitution t = θx yields
$$J_2 = \frac{1}{\theta^2}\left(\int_0^{\infty} \ln t\, e^{-t}\, dt + (1 - \ln\theta)\int_0^{\infty} e^{-t}\, dt\right) = \frac{-\gamma + 1 - \ln\theta}{\theta^2}.$$
Finally,
$$I_1 = \frac{\theta^2}{\theta\alpha+\mu}\left(\alpha J_1 + \mu J_2\right) = \frac{\theta^2}{\theta\alpha+\mu}\left(\alpha\,\frac{-\gamma-\ln\theta}{\theta} + \mu\,\frac{-\gamma+1-\ln\theta}{\theta^2}\right) = -\ln\theta + \frac{\mu}{\theta\alpha+\mu} - \gamma$$
and
$$GM(X) = e^{I_1} = \frac{1}{\theta}\, e^{\frac{\mu}{\theta\alpha+\mu} - \gamma}. \qquad \square$$
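Theorem 1 can be checked by direct quadrature against the closed form (an illustrative sketch, assuming NumPy and SciPy; the parameter values are arbitrary):

```python
import numpy as np
from scipy.integrate import quad

theta, alpha, mu = 2.0, 1.0, 3.0

def pdf3(x):
    """Three-parameter Lindley pdf with the parameters fixed above."""
    return theta**2 / (theta * alpha + mu) * (alpha + mu * x) * np.exp(-theta * x)

# I1 = int_0^inf ln(x) f_X(x) dx, split at 1 to isolate the log singularity.
a, _ = quad(lambda x: np.log(x) * pdf3(x), 0.0, 1.0)
b, _ = quad(lambda x: np.log(x) * pdf3(x), 1.0, np.inf)
gm_numeric = np.exp(a + b)

# Closed form from Theorem 1: GM(X) = (1/theta) exp(mu/(theta*alpha+mu) - gamma).
gm_formula = (1.0 / theta) * np.exp(mu / (theta * alpha + mu) - np.euler_gamma)
print(gm_numeric, gm_formula)  # the two values agree
```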
Theorem 2.
If Y is a random variable having the probability density function
$$f_Y(y;\delta,\alpha,\eta,\theta,\beta) : (\beta,\infty) \to \mathbb{R}, \quad f_Y(y;\delta,\alpha,\eta,\theta,\beta) = \frac{\theta}{\delta\alpha+\eta}\left[\delta\alpha+\eta\theta(y-\beta)\right] e^{-\theta(y-\beta)},$$
with δ, α, η, θ ∈ (0,∞), β ∈ [0,∞), then
$$GM(Y) = \begin{cases} e^{I_2}, & \beta > 0 \\ \dfrac{1}{\theta}\, e^{\frac{\eta}{\delta\alpha+\eta} - \gamma}, & \beta = 0, \end{cases}$$
where
$$I_2 = \ln\beta + \frac{\eta}{\delta\alpha+\eta} + \left(1 - \frac{\eta\theta\beta}{\delta\alpha+\eta}\right) e^{\theta\beta}\, E_1(\theta\beta).$$
Proof. 
If β > 0, the substitution z = θ(y − β) gives
$$I_2 = \int_{\beta}^{\infty} \ln y\, f_Y(y;\delta,\alpha,\eta,\theta,\beta)\, dy = \frac{1}{\delta\alpha+\eta}\int_0^{\infty} \ln\frac{\theta\beta+z}{\theta}\,(\delta\alpha+\eta z)\, e^{-z}\, dz.$$
Integrating by parts, and using
$$\int_0^{\infty} \frac{e^{-z}}{\theta\beta+z}\, dz = e^{\theta\beta} \int_{\theta\beta}^{\infty} \frac{e^{-t}}{t}\, dt = e^{\theta\beta} E_1(\theta\beta),$$
one obtains
$$\int_0^{\infty} \ln(\theta\beta+z)\, e^{-z}\, dz = \ln(\theta\beta) + e^{\theta\beta} E_1(\theta\beta),$$
$$\int_0^{\infty} z \ln(\theta\beta+z)\, e^{-z}\, dz = \ln(\theta\beta) + 1 + (1-\theta\beta)\, e^{\theta\beta} E_1(\theta\beta).$$
Combining these,
$$I_2 = -\ln\theta + \ln(\theta\beta) + \frac{\eta}{\delta\alpha+\eta} + \frac{\delta\alpha+\eta(1-\theta\beta)}{\delta\alpha+\eta}\, e^{\theta\beta} E_1(\theta\beta) = \ln\beta + \frac{\eta}{\delta\alpha+\eta} + \left(1 - \frac{\eta\theta\beta}{\delta\alpha+\eta}\right) e^{\theta\beta} E_1(\theta\beta).$$
If β = 0, we have GM(Y) = e^{I_3}, where, with J₁ and J₂ as in the proof of Theorem 1,
$$I_3 = \int_0^{\infty} \ln y\, \frac{\theta}{\delta\alpha+\eta}\,(\delta\alpha+\eta\theta y)\, e^{-\theta y}\, dy = \frac{\theta}{\delta\alpha+\eta}\left(\delta\alpha\, J_1 + \eta\theta\, J_2\right) = -\ln\theta + \frac{\eta}{\delta\alpha+\eta} - \gamma. \qquad \square$$
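The β > 0 branch of Theorem 2 can likewise be checked against direct quadrature (an illustrative sketch, assuming NumPy and SciPy; the parameter values are arbitrary):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import exp1

delta, alpha, eta, theta, beta = 1.2, 0.8, 2.0, 1.5, 0.5
c = delta * alpha + eta

def pdf5(y):
    """Five-parameter Lindley pdf with the parameters fixed above."""
    z = y - beta
    return theta / c * (delta * alpha + eta * theta * z) * np.exp(-theta * z)

# I2 = int_beta^inf ln(y) f_Y(y) dy; ln(y) is smooth on (beta, inf) for beta > 0.
i2_numeric, _ = quad(lambda y: np.log(y) * pdf5(y), beta, np.inf)

# Closed form from Theorem 2.
i2_formula = (np.log(beta) + eta / c
              + (1.0 - eta * theta * beta / c) * np.exp(theta * beta) * exp1(theta * beta))
print(i2_numeric, i2_formula)  # the two values agree
```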
Theorem 3.
If Y is a random variable having the cumulative distribution function
$$F_Y(y;\delta,\alpha,\eta,\theta,\beta) : \mathbb{R} \to \mathbb{R}, \quad F_Y(y;\delta,\alpha,\eta,\theta,\beta) = \begin{cases} 1 - \left(1 + \dfrac{\theta\eta(y-\beta)}{\delta\alpha+\eta}\right) e^{-\theta(y-\beta)}, & y > \beta \\ 0, & y \le \beta, \end{cases}$$
with δ, α, η, θ ∈ (0,∞), β ∈ [0,∞), then
$$\mathcal{E}(Y) = \frac{1}{\theta(\delta\alpha+\eta)}\left[\delta\alpha + 2\eta - \eta\, e^{\frac{\delta\alpha+\eta}{\eta}}\, E_1\!\left(\frac{\delta\alpha+\eta}{\eta}\right)\right].$$
Proof. 
We have, for y > β,
$$\bar{F}_Y(y;\delta,\alpha,\eta,\theta,\beta) = \left(1 + \frac{\theta\eta(y-\beta)}{\delta\alpha+\eta}\right) e^{-\theta(y-\beta)},$$
while F̄_Y = 1 on (0, β], so this part contributes nothing to the integral. Write c = δα + η and substitute z = θ(y − β):
$$\mathcal{E}(Y) = -\int_{\beta}^{\infty} \bar{F}_Y\, \ln \bar{F}_Y\, dy = -\frac{1}{\theta}\int_0^{\infty} \left(1+\frac{\eta z}{c}\right) e^{-z}\left[-z + \ln\left(1+\frac{\eta z}{c}\right)\right] dz.$$
The first part equals
$$\frac{1}{\theta}\int_0^{\infty} \left(1+\frac{\eta z}{c}\right) z\, e^{-z}\, dz = \frac{1}{\theta}\left(\Gamma(2) + \frac{\eta}{c}\,\Gamma(3)\right) = \frac{\delta\alpha+3\eta}{\theta c}.$$
For the second, two integrations by parts give
$$\int_0^{\infty} \left(1+\frac{\eta z}{c}\right)\ln\left(1+\frac{\eta z}{c}\right) e^{-z}\, dz = \frac{\eta}{c}\left(1 + \int_0^{\infty} \frac{\eta}{c+\eta z}\, e^{-z}\, dz\right) = \frac{\eta}{c}\left(1 + e^{c/\eta}\, E_1\!\left(\frac{c}{\eta}\right)\right),$$
the last equality by the substitution t = z + c/η. Therefore
$$\mathcal{E}(Y) = \frac{\delta\alpha+3\eta}{\theta c} - \frac{\eta}{\theta c}\left(1 + e^{c/\eta}\, E_1\!\left(\frac{c}{\eta}\right)\right) = \frac{1}{\theta(\delta\alpha+\eta)}\left[\delta\alpha + 2\eta - \eta\, e^{\frac{\delta\alpha+\eta}{\eta}}\, E_1\!\left(\frac{\delta\alpha+\eta}{\eta}\right)\right]. \qquad \square$$
Theorem 4.
If X is a random variable having the cumulative distribution function
$$F_X(x;\theta,\alpha,\mu) : \mathbb{R} \to \mathbb{R}, \quad F_X(x;\theta,\alpha,\mu) = \begin{cases} 1 - \left(1 + \dfrac{\theta\mu x}{\theta\alpha+\mu}\right) e^{-\theta x}, & x > 0 \\ 0, & x \le 0, \end{cases}$$
with θ > 0, α > 0, μ > 0, then
$$\mathcal{E}(X) = \frac{1}{\theta(\theta\alpha+\mu)}\left[\theta\alpha + 2\mu - \mu\, e^{\frac{\theta\alpha+\mu}{\mu}}\, E_1\!\left(\frac{\theta\alpha+\mu}{\mu}\right)\right].$$
Proof. 
The proof comes directly from Theorem 3, by choosing β = 0 , δ = θ and η = μ . □

4. Discussion

Regarding the characteristics of random variables, one can notice that in some papers the geometric mean is considered [1,2,3,4,5,6,7]. As a measure of the uncertainty associated with a random variable, the cumulative residual entropy [10] overcomes some drawbacks of the differential entropy.
In this paper, two generalizations of the Lindley distribution [22,23] were discussed. The three-parameter distribution [22] is a submodel of the five-parameter [23] one. The work focused on the geometric mean and cumulative residual entropy of these two distributions. The cumulative residual entropy of the one with three parameters can be deduced directly from the one with five parameters, as shown in Theorems 3 and 4.
In connection with the geometric mean, note that the integral I₂ from Theorem 2 can be transformed as follows. Integrating E₁ by parts,
$$E_1(\theta\beta) = \int_{\theta\beta}^{\infty} \frac{e^{-t}}{t}\, dt = -e^{-\theta\beta}\ln(\theta\beta) + \int_{\theta\beta}^{\infty} e^{-t}\ln t\, dt,$$
so that
$$I_2 = \frac{\eta}{\delta\alpha+\eta} + \frac{\eta\theta\beta\ln\beta}{\delta\alpha+\eta} + \left(1 - \frac{\eta\theta\beta}{\delta\alpha+\eta}\right)\left(-\ln\theta + e^{\theta\beta}\int_{\theta\beta}^{\infty} e^{-t}\ln t\, dt\right).$$
We have
$$\lim_{\beta\to 0^+} I_2 = \frac{\eta}{\delta\alpha+\eta} - \ln\theta + \int_0^{\infty} e^{-t}\ln t\, dt = -\ln\theta + \frac{\eta}{\delta\alpha+\eta} - \gamma.$$
Therefore the geometric mean of the five-parameter distribution is right-continuous at zero with respect to the parameter β. By taking β = 0 in Theorem 2 and then making the substitutions δ = θ, η = μ, the geometric mean of the three-parameter distribution can be deduced from that of the five-parameter distribution. Because of the special role of the parameter β in the calculation of the integrals, the geometric mean was nevertheless computed independently for each distribution, as seen in Theorems 1 and 2.
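The right-continuity at β = 0 can also be observed numerically (an illustrative sketch, assuming NumPy and SciPy; the parameter values are arbitrary): as β shrinks, ln GM(Y) from the β > 0 branch of Theorem 2 approaches the β = 0 value:

```python
import numpy as np
from scipy.special import exp1

delta, alpha, eta, theta = 1.2, 0.8, 2.0, 1.5
c = delta * alpha + eta

def log_gm(beta):
    """ln GM(Y) = I2 from Theorem 2, valid for beta > 0."""
    return (np.log(beta) + eta / c
            + (1.0 - eta * theta * beta / c) * np.exp(theta * beta) * exp1(theta * beta))

# ln GM(Y) for beta = 0, from the second branch of Theorem 2.
limit = -np.log(theta) + eta / c - np.euler_gamma

for beta in (1e-2, 1e-4, 1e-6):
    print(beta, log_gm(beta) - limit)  # the gap shrinks toward 0
```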

5. Conclusions

From the rather large set of Lindley-type distributions, two related distributions were selected for study. For each of them, formulas for the geometric mean and the cumulative residual entropy were obtained. These results complement those already known from previous works, extending the body of knowledge on Lindley-type distributions.

Author Contributions

Conceptualization, M.G. and C.-C.P.; methodology, M.G. and C.-C.P.; writing—original draft preparation, M.G. and C.-C.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Vogel, R.M. The geometric mean? Commun. Stat.—Theory Methods 2022, 51, 82–94. [Google Scholar] [CrossRef]
  2. Feng, C.; Wang, H.; Tu, X.M. Geometric mean of nonnegative random variable. Commun. Stat.—Theory Methods 2013, 42, 2714–2717. [Google Scholar] [CrossRef]
  3. Abyani, M.; Asgarian, B.; Zarrin, M. Sample geometric mean versus sample median in closed form framework of seismic reliability evaluation: A case study comparison. Earthq. Eng. Eng. Vib. 2019, 18, 187–201. [Google Scholar] [CrossRef]
  4. Mahajan, S. Don’t demean the geometric mean. Am. J. Phys. 2019, 87, 75–77. [Google Scholar] [CrossRef]
  5. Martinez, M.N.; Bartholomew, M.J. What does it “mean”? A review of interpreting and calculating different types of means and standard deviations. Pharmaceutics 2017, 9, 14. [Google Scholar] [CrossRef] [Green Version]
  6. Selvadurai, P.A.; Selvadurai, A.P.S. On the effective permeability of a heterogenous porous medium: The role of the geometric mean. Philos. Mag. 2014, 94, 2318–2338. [Google Scholar] [CrossRef]
  7. Thelwall, M. The precision of the arithmetic mean, geometric mean and percentiles for citation data: An experimental simulation modelling approach. J. Inf. 2016, 10, 110–123. [Google Scholar] [CrossRef] [Green Version]
  8. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  9. Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2006; pp. 243–259. [Google Scholar]
  10. Rao, M.; Chen, Y.; Vemuri, B.C.; Wang, F. Cumulative residual entropy: A new measure of information. IEEE Trans. Inf. Theory 2004, 50, 1220–1228. [Google Scholar] [CrossRef]
  11. Almarashi, A.M.; Algarni, A.; Hassan, A.S.; Zaky, A.N.; Elgarhy, M. Bayesian analysis of dynamic cumulative residual entropy for Lindley distribution. Entropy 2021, 23, 1256. [Google Scholar] [CrossRef]
  12. Asadi, M.; Zohrevand, Y. On the dynamic cumulative residual entropy. J. Stat. Plan. Inference 2007, 137, 1931–1941. [Google Scholar] [CrossRef]
  13. Toomaj, A.; Sunoj, S.M.; Navarro, J. Some properties of the cumulative residual entropy of coherent and mixed systems. J. Appl. Probab. 2017, 54, 379–393. [Google Scholar] [CrossRef]
  14. Xiong, H.; Shang, P.; Zhang, Y. Fractional cumulative residual entropy. Commun. Nonlinear Sci. Numer. Simul. 2019, 78. [Google Scholar] [CrossRef]
  15. Zardasht, V.; Parsi, S.; Mousazadeh, M. On empirical cumulative residual entropy and a goodness-of-fit test for exponentiality. Stat. Pap. 2015, 56, 677–688. [Google Scholar] [CrossRef] [Green Version]
  16. Lindley, D.V. Fiducial distributions and Bayes’ theorem. J. R. Stat. Soc. Ser. B 1958, 20, 102–107. [Google Scholar] [CrossRef]
  17. Ghitany, M.E.; Atieh, B.; Nadarajah, S. Lindley distribution and its application. Math. Comput. Simul. 2008, 78, 493–506. [Google Scholar] [CrossRef]
  18. Abd El-Bar, A.M.T.; da Silva, W.B.F.; Nascimento, A.D.C. An extended log-Lindley-G family: Properties and experiments in repairable data. Mathematics 2021, 9, 3108. [Google Scholar] [CrossRef]
  19. Abouammoh, A.; Kayid, M. A new flexible generalized Lindley model: Properties, estimation and applications. Symmetry 2020, 12, 1678. [Google Scholar] [CrossRef]
  20. Gómez-Déniz, E.; Sordo, M.A.; Calderín-Ojeda, E. The Log-Lindley distribution as an alternative to the beta regression model with applications in insurance. Insur. Math. Econ. 2014, 54, 49–57. [Google Scholar] [CrossRef]
  21. Korkmaz, M.Ç.; Yousof, H.M. The one-parameter odd Lindley exponential model: Mathematical properties and applications. Stochastics Qual. Control 2017, 32, 25–35. [Google Scholar] [CrossRef]
  22. Shanker, R.; Shukla, K.K.; Shanker, R.; Leonida, T.A. A three-parameter Lindley distribution. Am. J. Math. Stat. 2017, 7, 15–26. [Google Scholar] [CrossRef]
  23. Tharshan, R.; Wijekoon, P. A comparison study on a new five-parameter generalized Lindley distribution with its sub-models. Stat. Transit. New Ser. 2020, 21, 89–117. [Google Scholar] [CrossRef]
  24. Bakouch, H.S.; Jazi, M.A.; Nadarajah, S. A new discrete distribution. Statistics 2014, 48, 200–240. [Google Scholar] [CrossRef]
  25. Gómez-Déniz, E.; Calderín-Ojeda, E. The discrete Lindley distribution: Properties and applications. J. Stat. Comput. Simul. 2011, 81, 1405–1416. [Google Scholar] [CrossRef]
  26. Yilmaz, M.; Hameldarbandi, M.; Kemaloglu, S.A. Exponential-modified discrete Lindley distribution. SpringerPlus 2016, 5, 1660. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Thamer, M.K.; Zine, R. Comparison of five methods to estimate the parameters for the three-parameter Lindley distribution with application to life data. Comput. Math. Methods Med. 2021, 2021, 2689000. [Google Scholar] [CrossRef]
  28. Lagarias, J.C. Euler’s constant: Euler’s work and modern developments. Bull. Amer. Math. Soc. 2013, 50, 527–628. [Google Scholar] [CrossRef]
  29. Whittaker, E.T.; Watson, G.N. A Course of Modern Analysis, 4th ed.; Cambridge University Press: Cambridge, UK, 1996; pp. 235–264. [Google Scholar]
  30. Gautschi, W.; Cahill, W.F. Exponential integral and related functions. In Handbook of Mathematical Functions with Formulas, Graphs and Mathematical Tables; Abramowitz, M., Stegun, I.A., Eds.; Dover Publications: New York, NY, USA, 1965; pp. 227–252. [Google Scholar]