Article

An Entropy Formulation Based on the Generalized Liouville Fractional Derivative

by Rui A. C. Ferreira 1,† and J. Tenreiro Machado 2,*,†
1
Grupo Física-Matemática, Faculdade de Ciências, Universidade de Lisboa, Avenida Professor Gama Pinto, 2, 1649-003 Lisboa, Portugal
2
Institute of Engineering, Polytechnic of Porto, Department of Electrical Engineering, R. Dr. António Bernardino de Almeida, 431, 4249-015 Porto, Portugal
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2019, 21(7), 638; https://doi.org/10.3390/e21070638
Submission received: 15 June 2019 / Revised: 21 June 2019 / Accepted: 25 June 2019 / Published: 28 June 2019
(This article belongs to the Special Issue The Fractional View of Complexity)

Abstract

This paper presents a new formula for the entropy of a distribution, conceived with the Liouville fractional derivative in mind. To illustrate the new concept, the proposed definition is applied to the Dow Jones Industrial Average. Moreover, the Jensen-Shannon divergence is also generalized and its variation with the fractional order is tested for the time series.

1. Introduction

The quest for generalizing the Boltzmann–Gibbs entropy has become an active field of research in the past 30 years. Indeed, many formulations have appeared in the literature extending the well-known formula (see, e.g., [1,2,3,4]):
$S(p) = -\sum_i \ln(p_i)\, p_i. \qquad (1)$
The (theoretical) approaches to generalizing Equation (1) may vary considerably (see, e.g., [1,2,3]). In this work we are particularly interested in the method first proposed by Abe in [1], whose basic idea consists of rewriting Equation (1) as
$S(p) = -\sum_i \left.\frac{d}{dt}\, p_i^t\right|_{t=1} = \sum_i \left.\frac{d}{dt}\, p_i^{-t}\right|_{t=-1}. \qquad (2)$
Concretely, we substitute the differential operator $\frac{d}{dt}$ in Equation (2) by a suitable fractional one (see Section 2 for the details) and then, after some calculations, we obtain a novel (at least to the best of our knowledge) formula, which depends on a parameter $0 < \alpha \le 1$.
The paper is structured as follows. Section 2 introduces and discusses the motivation for the new entropy formulation. Section 3 analyzes the Dow Jones Industrial Average. Additionally, the Jensen-Shannon divergence is adopted in conjunction with the hierarchical clustering technique for analyzing regularities embedded in the time series. Finally, Section 4 outlines the conclusions.

2. Motivation

Let us introduce the following entropy function:
$S_\alpha(p) = \sum_i \frac{\Gamma(1-\ln(p_i))}{\Gamma(1-\ln(p_i)-\alpha)}\, p_i, \quad \alpha \in (0,1], \qquad (3)$
where $\Gamma(t) = \int_0^\infty x^{t-1} e^{-x}\, dx$ is the gamma function.
We define the quantity $I_\alpha(p_i) = \frac{\Gamma(1-\ln(p_i))}{\Gamma(1-\ln(p_i)-\alpha)}$ as the Liouville information (see Figure 1).
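As a numerical aside (not part of the original derivation), Equation (3) can be evaluated stably by working with log-gamma values instead of the gamma function itself; the sketch below assumes `scipy` is available, and the function names are ours:

```python
import numpy as np
from scipy.special import gammaln  # log-gamma avoids overflow of Gamma itself

def liouville_information(p, alpha):
    """I_alpha(p) = Gamma(1 - ln p) / Gamma(1 - ln p - alpha)."""
    lp = np.log(p)
    return np.exp(gammaln(1.0 - lp) - gammaln(1.0 - lp - alpha))

def entropy_alpha(p, alpha):
    """S_alpha(p) = sum_i I_alpha(p_i) p_i, i.e., Equation (3); zero-probability
    terms are dropped, following the convention discussed in Remark 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(liouville_information(p, alpha) * p))
```

For $\alpha = 1$ the ratio of gamma functions collapses to $-\ln p_i$, so `entropy_alpha` reduces to the Shannon entropy, which gives a quick consistency check of the definition.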
Our motivation to define the entropy function given by Equation (3) is essentially due to the works of Abe [1] and Ubriaco [4]. Indeed, in Section 3 of [4], the author notes (based on Abe’s work [1]) that the Boltzmann-Gibbs and the Tsallis entropies may be obtained by
$S = -\sum_i \left.\frac{d}{dt}\, p_i^t\right|_{t=1},$
and
$S = -\sum_i \left.\frac{d}{d_q t}\, p_i^t\right|_{t=1}, \quad \text{where} \quad \frac{d}{d_q t} f(t) = \frac{f(qt)-f(t)}{(q-1)t},$
respectively. From this, he substitutes the above differential operator by a Liouville fractional derivative (see Section 2.3 in [5]) and defines a fractional entropy (see (19) in [4]). With this in mind, and taking into account the generalization of the Liouville fractional derivative given by the "fractional derivative of a function with respect to another function" (see Section 2.5 in [5]), we consider using it to define a novel entropy. The Liouville fractional derivative of a function f with respect to another function g (with $g' > 0$) is defined by [5,6],
$D_g^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)}\, \frac{1}{g'(t)}\, \frac{d}{dt} \int_{-\infty}^{t} \left[g(t)-g(s)\right]^{-\alpha} g'(s)\, f(s)\, ds, \quad 0 < \alpha \le 1. \qquad (4)$
It is important to keep in mind that our goal is to obtain an explicit formula for the entropy. Therefore, a "good" candidate for g is the exponential function, due to the fact that $p_i^{-t} = e^{-t\ln(p_i)}$ and also to the structure of Equation (4). We chose $g(x) = e^{x+1}$. Let us then calculate Equation (4) with $f(t) = p_i^{-t}$ and $g(x) = e^{x+1}$. We obtain
$
\begin{aligned}
D_g^{\alpha} f(t) &= \frac{1}{\Gamma(1-\alpha)\, e^{t+1}}\, \frac{d}{dt} \int_{-\infty}^{t} \left[e^{t+1}-e^{s+1}\right]^{-\alpha} e^{s+1}\, e^{-s\ln(p_i)}\, ds \\
&= \frac{1}{\Gamma(1-\alpha)\, e^{t+1}}\, \frac{d}{dt}\, e^{-\alpha(t+1)} \int_{-\infty}^{t} \left[1-e^{s-t}\right]^{-\alpha} e^{s(1-\ln(p_i))+1}\, ds \\
&= \frac{1}{\Gamma(1-\alpha)\, e^{t+1}}\, \frac{d}{dt}\, e^{-\alpha(t+1)} \int_{0}^{1} (1-u)^{-\alpha}\, e^{(t+\ln(u))(1-\ln(p_i))+1}\, \frac{du}{u} \\
&= \frac{1}{\Gamma(1-\alpha)\, e^{t+1}}\, \frac{d}{dt}\, e^{-\alpha(t+1)+t(1-\ln(p_i))+1} \int_{0}^{1} (1-u)^{-\alpha}\, u^{-\ln(p_i)}\, du \\
&= \frac{-\alpha+1-\ln(p_i)}{\Gamma(1-\alpha)\, e^{t+1}}\, e^{-\alpha(t+1)+t(1-\ln(p_i))+1}\, \frac{\Gamma(1-\alpha)\,\Gamma(1-\ln(p_i))}{\Gamma(2-\alpha-\ln(p_i))} \\
&= \left(1-\alpha-\ln(p_i)\right)\, e^{(-\alpha-1)(t+1)+t(1-\ln(p_i))+1}\, \frac{\Gamma(1-\ln(p_i))}{\Gamma(2-\alpha-\ln(p_i))}.
\end{aligned}
$
It follows that,
$D_g^{\alpha} f(-1) = \left(1-\alpha-\ln(p_i)\right)\, p_i\, \frac{\Gamma(1-\ln(p_i))}{\Gamma(2-\alpha-\ln(p_i))},$
and after using the property $\Gamma(x+1) = x\,\Gamma(x)$, $x > 0$, we finally get
$D_g^{\alpha} f(-1) = p_i\, \frac{\Gamma(1-\ln(p_i))}{\Gamma(1-\alpha-\ln(p_i))}.$
We have, therefore, motivated the definition provided in Equation (3).
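The algebra above can be spot-checked numerically. The sketch below (ours, assuming `scipy`) evaluates the closed form of $D_g^{\alpha}f(-1)$ and confirms that at $\alpha = 1$ it reduces to the classical value $\left.\frac{d}{dt}\, p_i^{-t}\right|_{t=-1} = -p_i \ln(p_i)$:

```python
import numpy as np
from scipy.special import gammaln

def D_closed(p, alpha):
    """Closed form p * Gamma(1 - ln p) / Gamma(1 - alpha - ln p)
    for D_g^alpha f(-1), with f(t) = p**(-t) and g(x) = exp(x + 1)."""
    lp = np.log(p)
    return p * np.exp(gammaln(1.0 - lp) - gammaln(1.0 - alpha - lp))

# At alpha = 1 the gamma ratio collapses to -ln(p), so D_closed(p, 1.0)
# equals -p * ln(p), the classical derivative value.
```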
Remark 1.
We note that, if $p_i = 1$ for some $i \in \mathbb{N}$ and $\alpha = 1$, then in Equation (3) we have a division by zero. In this case we are obviously thinking about the limit of that function, i.e.,
$\lim_{(x,\alpha) \to (1,1)} \frac{\Gamma(1-\ln(x))}{\Gamma(1-\ln(x)-\alpha)}\, x = 0.$
In addition, it is not hard to check that, for $0 < \alpha \le 1$, we have
$\lim_{x \to 0^+} \frac{\Gamma(1-\ln(x))}{\Gamma(1-\ln(x)-\alpha)}\, x = 0.$
Therefore, we put in Equation (3): $\frac{\Gamma(1-\ln(0))}{\Gamma(1-\ln(0)-\alpha)} \cdot 0 = 0$, with $0 < \alpha \le 1$.
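Both limits can be observed numerically; the short sketch below (ours, assuming `scipy`) evaluates the general summand of Equation (3) near the two limit points:

```python
import numpy as np
from scipy.special import gammaln

def summand(x, alpha):
    """x * Gamma(1 - ln x) / Gamma(1 - ln x - alpha), the general term of Equation (3)."""
    lx = np.log(x)
    return x * np.exp(gammaln(1.0 - lx) - gammaln(1.0 - lx - alpha))

small = summand(np.array([1e-4, 1e-8, 1e-12]), 0.7)        # x -> 0+
near_one = summand(np.array([0.999, 0.999999]), 0.999999)  # (x, alpha) -> (1, 1)
```

Both sequences shrink toward zero, in agreement with the two limits above.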
Remark 2.
The entropy function defined in Equation (3) brings interesting challenges. For instance, although numerically the function
$\frac{\Gamma(1-\ln(x))}{\Gamma(1-\ln(x)-\alpha)}\, x, \quad x \in (0,1),$
for $\alpha \in (0,1)$ seems to be concave (see Figure 2), a rigorous proof of that fact has not yet been obtained.

3. An Example of Application

The Dow Jones Industrial Average (DJIA) is a stock-market index based on the value of 30 large companies from the United States. The DJIA and other financial indices reveal a fractal nature and have been the topic of many studies using distinct mathematical and computational tools [7,8]. In this section we apply the previous concepts to the study of the DJIA in order to verify the variation of the new expressions with the fractional order. We start by analyzing the evolution of the daily closing values of the DJIA from January 1, 1985, to April 5, 2019, in the perspective of Equation (3). All weeks include five days, and missing values corresponding to special days are interpolated between adjacent values. For calculating the entropy we consider time windows of 149 days, covering a total of n = 34 years.
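A possible implementation of this windowed analysis is sketched below. This is our code, not the authors'; the histogram-based probability estimate and the bin count are our own illustrative choices, while the 149-day window follows the text:

```python
import numpy as np
from scipy.special import gammaln

def s_alpha(p, alpha):
    """Equation (3); zero-probability bins contribute nothing."""
    p = p[p > 0]
    lp = np.log(p)
    return float(np.sum(np.exp(gammaln(1.0 - lp) - gammaln(1.0 - lp - alpha)) * p))

def windowed_entropy(series, window=149, bins=30, alpha=0.5):
    """Slide non-overlapping windows over the series, estimate a probability
    distribution per window from a histogram, and evaluate S_alpha on each."""
    out = []
    for start in range(0, len(series) - window + 1, window):
        hist, _ = np.histogram(series[start:start + window], bins=bins)
        out.append(s_alpha(hist / hist.sum(), alpha))
    return np.array(out)
```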
Figure 3 and Figure 4 show the time evolution of the DJIA and the corresponding value of $S_\alpha$ for $\alpha = 0, 0.1, \ldots, 0.9, 1$.
We verify that $S_\alpha(t)$ has a smooth evolution with $\alpha$, which plays the role of a parameter for adjusting the sensitivity of the entropy index.
The Jensen-Shannon divergence (JSD) measures the similarity between two probability distributions and is given by
$JSD(P\,\|\,Q) = \frac{1}{2} D(P\,\|\,M) + \frac{1}{2} D(Q\,\|\,M),$
where $M = \frac{1}{2}(P+Q)$, and $D(P\,\|\,M)$ and $D(Q\,\|\,M)$ represent the Kullback-Leibler divergences of $P$ and $Q$ from $M$, respectively.
For the classical Shannon information $I(p_i) = -\log p_i$, the JSD can be calculated as:
$JSD(P\,\|\,Q) = \frac{1}{2}\left[\sum_i p_i \log p_i + \sum_i q_i \log q_i\right] - \sum_i m_i \log m_i.$
In the case of the Liouville information $I_\alpha(p_i) = \frac{\Gamma(1-\ln(p_i))}{\Gamma(1-\ln(p_i)-\alpha)}$, the JSD can be calculated as:
$JSD(P\,\|\,Q;\alpha) = -\frac{1}{2}\left[\sum_i p_i \frac{\Gamma(1-\ln p_i)}{\Gamma(1-\ln p_i - \alpha)} + \sum_i q_i \frac{\Gamma(1-\ln q_i)}{\Gamma(1-\ln q_i - \alpha)}\right] + \sum_i m_i \frac{\Gamma(1-\ln m_i)}{\Gamma(1-\ln m_i - \alpha)}.$
Obviously, for α = 1 we obtain the Shannon formulation.
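In code, the generalized divergence can be sketched as follows (our helper names, assuming `scipy`; all probabilities are taken strictly positive for simplicity):

```python
import numpy as np
from scipy.special import gammaln

def info_alpha(w, alpha):
    """Liouville information; for alpha = 1 it reduces to -log(w)."""
    lw = np.log(w)
    return np.exp(gammaln(1.0 - lw) - gammaln(1.0 - lw - alpha))

def jsd_alpha(p, q, alpha):
    """Generalized Jensen-Shannon divergence built from the Liouville information."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    h = lambda w: float(np.sum(w * info_alpha(w, alpha)))
    return h(m) - 0.5 * (h(p) + h(q))
```

For $\alpha = 1$ this matches the classical JSD computed from the two Kullback-Leibler divergences.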
For processing the data produced by the JSD we adopt hierarchical clustering (HC). The main objective of the HC is to group together (or to place far apart) objects that are similar (or different) [9,10,11,12]. The HC receives as input a symmetric matrix D of distances (e.g., the JSD) between the n items under analysis and produces as output a graph, in the form of a dendrogram or a tree, where the length of the links represents the distance between data objects. We have two alternative algorithms, namely agglomerative and divisive clustering. In the first, each object starts in its own singleton cluster and, at each iteration of the HC scheme, the two most similar (in some sense) clusters are merged. The iterations stop when a single cluster contains all objects. In the second, all objects start in a single cluster and, at each step, the HC removes the 'outsiders' from the least cohesive cluster. The iterations stop when each object is in its own singleton cluster. Both iterative schemes rely on an appropriate metric (a measure of the distance between pairs of objects) and a linkage criterion, which defines the dissimilarity between clusters as a function of the pairwise distances between objects.
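The pipeline of this section can be sketched end-to-end as follows. The data and parameter choices are our own toy illustration; in the paper the distance matrix comes from yearly DJIA distributions rather than random ones:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy stand-in: a few probability distributions compared pairwise with the
# Shannon JSD, then grouped by agglomerative hierarchical clustering.
rng = np.random.default_rng(0)
X = rng.random((6, 4))
P = X / X.sum(axis=1, keepdims=True)  # six toy distributions over four bins

def kl(a, b):
    return np.sum(a * np.log(a / b), axis=-1)

M = 0.5 * (P[:, None, :] + P[None, :, :])
D = 0.5 * kl(P[:, None, :], M) + 0.5 * kl(P[None, :, :], M)  # pairwise JSD matrix
np.fill_diagonal(D, 0.0)

# Agglomerative HC on the condensed distance matrix, average linkage.
Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

For real data, `scipy.cluster.hierarchy.dendrogram(Z)` would draw trees of the kind shown in Figures 5-7.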
In our case the objects correspond to the n = 34 years, from January 1, 1985, to December 31, 2018, which are compared using the JSD; the resulting matrix D (of dimension n × n) is then processed by means of the HC.
Figure 5, Figure 6 and Figure 7 show the trees generated by the hierarchical clustering for the Shannon and the Liouville Jensen-Shannon divergence measures (with $\alpha = 0.1$ and $\alpha = 0.5$), respectively. The 2-digit labels of the 'leaves' of the trees denote the years.
We note that our goal is not to characterize the dynamics of the DJIA time evolution, since that is outside the scope of this paper. In fact, we adopt the DJIA simply as a prototype data series for assessing the effect of changing the value of α in the JSD and, consequently, in the HC-generated tree. We verify that, in general, there is a strong similarity of the DJIA between consecutive years. Concerning the use of the Shannon versus the Liouville JSD, we observe that for α close to 1 both entropies lead to identical results, while for α close to 0 the Liouville JSD produces a distinct tree. Therefore, we conclude that we can adjust the clustering performance by a proper tuning of the parameter α.

4. Conclusions

This paper presented a new formulation for entropy based on the (generalized) Liouville definition of fractional derivative. The generalization leads not only to a new entropy index, but also to novel expressions for fractional information and Jensen-Shannon divergence. The sensitivity of the proposed expression to variations of the fractional order is tested for the DJIA time series.

Author Contributions

Conceptualization, R.A.C.F.; Data curation, J.T.M.; Formal analysis, R.A.C.F.; Methodology, R.A.C.F.; Software, J.T.M.; Visualization, J.T.M.; Writing—original draft, R.A.C.F. and J.T.M.

Funding

Fundação para a Ciência e a Tecnologia (FCT): IF/01345/2014.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Abe, S. A note on the q-deformation-theoretic aspect of the generalized entropies in nonextensive physics. Phys. Lett. A 1997, 224, 326–330. [Google Scholar] [CrossRef]
  2. Machado, J.A.T. Fractional Order Generalized Information. Entropy 2014, 16, 2350–2361. [Google Scholar] [CrossRef] [Green Version]
  3. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
  4. Ubriaco, M.R. Entropies based on fractional calculus. Phys. Lett. A 2009, 373, 2516–2519. [Google Scholar] [CrossRef] [Green Version]
  5. Kilbas, A.; Srivastava, H.; Trujillo, J. Theory and Applications of Fractional Differential Equations; Elsevier: Amsterdam, The Netherlands, 2006. [Google Scholar]
  6. Samko, S.; Kilbas, A.; Marichev, O. Fractional Integrals and Derivatives: Theory and Applications; Gordon and Breach Science Publishers: Yverdon-les-Bains, Switzerland, 1993. [Google Scholar]
  7. Machado, J.A.T. Complex Dynamics of Financial Indices. Nonlinear Dyn. 2013, 74, 287–296. [Google Scholar] [CrossRef]
  8. Machado, J.A.T. Relativistic Time Effects in Financial Dynamics. Nonlinear Dyn. 2014, 75, 735–744. [Google Scholar] [CrossRef]
  9. Hartigan, J.A. Clustering Algorithms; John Wiley & Sons: New York, NY, USA, 1975. [Google Scholar]
  10. Sokal, R.R.; Rohlf, F.J. The comparison of dendrograms by objective methods. Taxon 1962, 11, 33–40. [Google Scholar] [CrossRef]
  11. Kaufman, L.; Rousseeuw, P.J. Finding Groups in Data: An Introduction to Cluster Analysis; Wiley-Interscience: Hoboken, NJ, USA, 2005. [Google Scholar]
  12. Maimon, O.; Rokach, L. Data Mining and Knowledge Discovery Handbook; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar]
Figure 1. Liouville information $I_\alpha(p_i) = \frac{\Gamma(1-\ln(p_i))}{\Gamma(1-\ln(p_i)-\alpha)}$.
Figure 2. Plot of $p_i\, \frac{\Gamma(1-\ln(p_i))}{\Gamma(1-\ln(p_i)-\alpha)}$ versus $p_i$.
Figure 3. Evolution of the Dow Jones Industrial Average (DJIA) versus time t from January 1, 1985, to April 5, 2019.
Figure 4. Evolution of $S_\alpha(t)$ versus time t from January 1, 1985, to April 5, 2019, for $\alpha = 0, 0.1, \ldots, 0.9, 1$.
Figure 5. Tree generated by the hierarchical clustering for the Shannon JSD.
Figure 6. Tree generated by the hierarchical clustering for the Liouville JSD and $\alpha = 0.1$.
Figure 7. Tree generated by the hierarchical clustering for the Liouville JSD and $\alpha = 0.5$.

