Article

Fisher and Shannon Functionals for Hyperbolic Diffusion

1 Comision Nacional de Energia Atomica, Centro Atomico Bariloche and Instituto Balseiro, Universidad Nacional de Cuyo, Av. E. Bustillo 9500, Bariloche CP 8400, Argentina
2 CONICET, Centro Atomico Bariloche, Av. E. Bustillo 9500, Bariloche CP 8400, Argentina
3 Departamento de Fisica, Facultad de Ingenieria and CONICET, Universidad Nacional del Comahue, Neuquen CP 8300, Argentina
4 Departamento de Física, Facultad de Ingeniería, Universidad Nacional de Mar del Plata (UNMDP), CONICET, Mar del Plata CP 7600, Argentina
5 Departamento de Física, Universidad Católica del Norte, Av. Angamos 0610, Antofagasta 1270709, Chile
* Author to whom correspondence should be addressed.
Entropy 2023, 25(12), 1627; https://doi.org/10.3390/e25121627
Submission received: 31 October 2023 / Revised: 1 December 2023 / Accepted: 2 December 2023 / Published: 6 December 2023
(This article belongs to the Special Issue Theory and Applications of Hyperbolic Diffusion and Shannon Entropy)

Abstract

The complexity measure for the distribution in space-time of a finite-velocity diffusion process is calculated. Numerical results are presented for the calculation of Fisher’s information, Shannon’s entropy, and the Cramér–Rao inequality, all of which are associated with a positively normalized solution to the telegrapher’s equation. In the framework of hyperbolic diffusion, the non-local Fisher’s information with the x-parameter is related to the local Fisher’s information with the t-parameter. A perturbation theory is presented to calculate Shannon’s entropy of the telegrapher’s equation at long times, as well as a toy model to describe the system as an attenuated wave in the ballistic regime (short times).

1. Introduction

Advances in understanding wave propagation in a conducting medium were achieved through the analysis of the telegrapher’s equation (TE), which originally appeared in the study of electromagnetic fields in waveguides [1,2,3]. This hyperbolic diffusion equation has been used in different areas of research, including the hyperbolic heat equation [4], generalized Cattaneo–Fick equations [5,6], neuroscience [7,8], biomedical optics [9], electromagnetic analysis in multilayered conductor planes [10], penetration of waves in complex conducting media [11], asymptotic diffusion from Boltzmann anisotropic scattering [12,13,14], the TE in 2D and 3D for engineering problems [15], describing cosmic microwave background radiation with spherical hyperbolic diffusion [16,17], finite-velocity diffusion in heterogeneous media [18,19,20,21], as well as in the damping and propagation of surface gravity waves on a random bottom [22].
The study of the propagation of thermal waves is also a fundamental theoretical and experimental subject [23,24], where the temperature profile $\psi(x,t) > 0$ can be described by the TE:
\left(\partial_t^2 + \frac{1}{\tau}\,\partial_t - v^2\,\partial_x^2\right)\psi(x,t) = 0,
thus, the wave packet propagates with velocity $v$ and is attenuated at a rate $\tau^{-1}$. The same TE serves as the starting point for studying electromagnetic field transport in waveguides, where Ohm’s law plays a fundamental role in describing the conducting media and characterizing the dissipative parameter $\tau^{-1}$ [1,2]. In this case, an electromagnetic dissipative wave is the solution to (1), but it is not necessary to impose positivity and normalization on it. It is interesting to note that Equation (1) has two extreme cases:
  • The wave limit
Taking $\tau\to\infty$, we recover the wave equation: $\left(\partial_t^2 - v^2\,\partial_x^2\right)\psi_{WE}(x,t) = 0$. Its solution can then be considered a wave packet that moves either to the right or to the left, represented as $\psi_{WE}(x \pm vt)$, without changing its form throughout the whole domain.
  • The diffusion limit
Taking $\tau\to 0$ and $v\to\infty$, such that $\tau v^2\to D$, we recover the diffusion equation: $\left(\partial_t - D\,\partial_x^2\right)\psi_W(x,t) = 0$, whose solution is given by the Wiener propagator: $\psi_W(x,t) = \exp\left(-x^2/4Dt\right)/\sqrt{4\pi D t}$.
In the following sections, we will present a Shannon entropic analysis [25] for the TE. Fisher’s information [26] is naturally linked to variations of entropy; therefore, it can be related to the control of disorder in a transport process. Fisher’s measure of indeterminacy has several natural and important applications in the design of codes and protocols, biophysical transport, machine learning, etc. In fact, this measure quantifies how much information observed (output) data carry about an unknown (input) parameter, and so Fisher’s information is widely used in optimal experimental design in many areas of research [27,28]. In the next sections, the analysis of the Shannon and Fisher functionals will be presented for the finite-velocity diffusion process (hyperbolic diffusion).
The paper is organized as follows: In Section 2, we present hyperbolic diffusion as a novel integro-differential equation for a non-Markov diffusion process. In Section 3, we present, for the first time, to our knowledge, the Fisher and Shannon functionals for the hyperbolic diffusion problem; in the Markovian limit, Fisher and Shannon functionals for Wiener’s diffusion are recovered. In Section 4, we calculate Shannon’s entropy, Fisher’s information, the complexity measure, and the Cramér–Rao bound. All of these novel results are obtained by solving numerically the TE (1). In Section 5, we present conclusions on the present approach, as well as its future extensions and applications. In addition, Appendix A, Appendix B and Appendix C are used to present the algebra necessary to prove the diffusion convergence, introduce a novel toy model for short times, and develop a perturbation theory for the calculation of Shannon’s entropy at long times.

2. Hyperbolic Diffusion

In the following, we are concerned with the hyperbolic diffusion problem. Therefore, in the present work, we are interested in solutions to (1) under the following conditions:
\psi(x,t) \ge 0, \qquad \int_{-\infty}^{+\infty} dx\,\psi(x,t) = 1,
and we look for a solution that fulfills the particular initial conditions:
\psi(x,t)\big|_{t=0} = \delta(x), \qquad \partial_t\psi(x,t)\big|_{t=0} = 0,
where δ x is the Dirac delta function. An analytic solution can straightforwardly be obtained by working with Equation (1) in the Fourier–Laplace representation; see Appendix A. Other initial conditions, such as a “bullet packet”, can also be worked out, but will not be considered in the present work.
It is interesting to note that the TE can also be written as an integral operator:
\partial_t\psi(x,t) = v^2\int_0^t e^{-(t-t')/\tau}\,\partial_x^2\psi(x,t')\,dt', \qquad \partial_t\psi(x,t)\big|_{t=0} = 0.
We note that, from this equation, we cannot use a bullet initial condition because, in that case, $\psi(x,t)\big|_{t=0} = \delta(x - vt)\big|_{t=0}$ and $\partial_t\psi(x,t)\big|_{t=0}\neq 0$.
Equation (4) demonstrates that the solution to the hyperbolic diffusion can be thought of as a non-Markov diffusion process (see chap. 7 in [29]).
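As a quick consistency check (ours, not spelled out in the paper), differentiating the integral form (4) with respect to time using Leibniz's rule recovers the TE (1):
\partial_t^2\psi(x,t) = v^2\,\partial_x^2\psi(x,t) - \frac{1}{\tau}\,v^2\int_0^t e^{-(t-t')/\tau}\,\partial_x^2\psi(x,t')\,dt' = v^2\,\partial_x^2\psi(x,t) - \frac{1}{\tau}\,\partial_t\psi(x,t),
which is Equation (1).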

3. Fisher and Shannon Functionals

The Shannon entropy can be defined as follows:
S(t) = -\int dx\,\psi(x,t)\ln\left[\psi(x,t)\,\Delta\right] = -\int dx\,\psi(x,t)\left[\ln\Delta + \ln\psi(x,t)\right] = \ln\frac{1}{\Delta} - \int dx\,\psi(x,t)\ln\psi(x,t),
in this formula, we use the normalization $\int dx\,\psi(x,t) = 1$ and explicitly write the small space parameter $\Delta$, which is necessary for a correct (dimensionless) definition of Shannon’s entropy for any normalized distribution. Thus, $S(t) - \ln\frac{1}{\Delta} \equiv \Delta S(t)$ can be interpreted as a relative entropy with respect to a sharp initial condition at the origin.
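A minimal numerical sketch (ours, not from the paper) of this definition: on an equally spaced grid the spacing dx can play the role of $\Delta$, and the entropy reduces to a single sum. The helper name shannon_entropy is hypothetical.
```python
import numpy as np

def shannon_entropy(psi, dx):
    """Discrete version of S = -int psi*ln(psi*Delta) dx, with Delta = dx."""
    p = np.clip(psi, 1e-300, None)           # avoid log(0) in empty bins
    return -np.sum(p * np.log(p * dx)) * dx

# example: Gaussian of standard deviation sigma; exact value is ln(sigma*sqrt(2*pi*e)/dx)
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
sigma = 1.0
psi = np.exp(-x**2 / (2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)
print(shannon_entropy(psi, dx), np.log(sigma*np.sqrt(2*np.pi*np.e)/dx))
```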

3.1. Wiener Diffusion Case

Here, we show a relation between the Shannon and Fisher information functionals for the diffusion process. Applying the time derivative to the definition of Shannon’s entropy of the Wiener process, denoted as $S_W(t)$, we obtain:
\frac{dS_W(t)}{dt} = -\frac{d}{dt}\int_{-\infty}^{+\infty} dx\, \psi_W(x,t)\ln\psi_W(x,t) = -\int_{-\infty}^{+\infty} dx\, \ln\psi_W(x,t)\,\partial_t\psi_W(x,t),
where we use the normalization of $\psi_W(x,t)$. For the Wiener process, the distribution fulfills the diffusion equation:
\partial_t\psi_W(x,t) = D\,\partial_x^2\psi_W(x,t),
then replacing this operator in (6), we obtain the following expression:
\frac{dS_W(t)}{dt} = -D\int_{-\infty}^{+\infty} dx\, \ln\psi_W(x,t)\,\partial_x^2\psi_W(x,t).
Integrating by parts on the right-hand side (RHS) and using $\ln\psi_W(x,t)\,\partial_x\psi_W(x,t)\big|_{x\to\pm\infty}\to 0$, we obtain the following:
\frac{dS_W(t)}{dt} = D\int_{-\infty}^{+\infty} dx\, \partial_x\psi_W(x,t)\,\frac{\partial_x\psi_W(x,t)}{\psi_W(x,t)} = D\int_{-\infty}^{+\infty} dx\, \frac{\left(\partial_x\psi_W(x,t)\right)^2}{\psi_W(x,t)}.
The RHS expression in (7) is proportional to Fisher’s information:
I_W(t) = \left\langle\left(\partial_x\ln\psi_W(x,t)\right)^2\right\rangle,
in this case, Fisher’s information concerns the x-parameter, and the symbol $\langle\cdot\rangle$ represents the mean value over the PDF $\psi(x,t)$. Then, we obtain the following:
\frac{dS_W(t)}{dt} = D\, I_W(t).
Using the Shannon entropy of the Wiener process, $S_W(t) = \ln\sqrt{4\pi e D t}$, in (9), we arrive at the following:
I_W(t) = \frac{1}{2Dt},
thus, Fisher’s information for the Wiener process is $I_W(t) = \langle x(t)^2\rangle^{-1}\propto t^{-1}$. Then, we write a closed expression connecting Shannon’s entropy and Fisher’s information for the Wiener process:
S_W(t) + \frac{1}{2}\ln I_W(t) = \frac{1}{2}\ln(2\pi e).
After a little algebra, we can write the following:
C_W(t) = \frac{e^{2S_W(t)}\, I_W(t)}{2\pi e} = 1,
where $C_W(t)$ is the complexity measure. This complexity combines Shannon’s entropy and Fisher’s information for Brownian motion in a simple manner. In general, the complexity measure is defined as follows [27,28]:
C = \frac{e^{2S}\, I}{2\pi e} \ge 1.
In addition, it is simple to see from (10) that the Cramér–Rao bound [30,31] is fulfilled for the Wiener process. Indeed, using $I_W(t) = 1/\langle x(t)^2\rangle$ and taking into account that the dispersion is $\langle x(t)^2\rangle = \langle\Delta x(t)^2\rangle$ (the mean value is zero), we can write the following bound:
I_W(t)\,\left\langle\Delta x(t)^2\right\rangle = 1,
and so we propose a generic Cramér–Rao lower bound to be used in the TE:
I(t)\,\left\langle\Delta x(t)^2\right\rangle \ge 1.
The Cramér–Rao inequality is a fundamental tool in information analysis and is widely used in optimal experimental design. Due to the reciprocity of estimator variance and Fisher’s information, minimizing variance corresponds to maximizing information.
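As a numerical sanity check (our own sketch, not part of the paper), sampling the Wiener solution on a grid should return both the complexity $C_W$ and the Cramér–Rao product $I_W\langle\Delta x^2\rangle$ equal to one; the grid and parameter values below are illustrative.
```python
import numpy as np

D, t = 5.0, 2.0
x = np.linspace(-60.0, 60.0, 20001)
dx = x[1] - x[0]
var = 2.0 * D * t                                   # <x^2> of the Wiener process
psi = np.exp(-x**2 / (2*var)) / np.sqrt(2*np.pi*var)

S = -np.sum(psi * np.log(psi)) * dx                 # differential entropy (Delta = 1)
dpsi = np.gradient(psi, dx)
I = np.sum(dpsi**2 / psi) * dx                      # x-parameter Fisher information
var_num = np.sum(x**2 * psi) * dx                   # spatial dispersion <Delta x^2>

print("C_W        =", np.exp(2*S) * I / (2*np.pi*np.e))   # ~ 1
print("I * <dx^2> =", I * var_num)                        # ~ 1
```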
In Appendix B, we present the general definition of the θ-parameter Fisher’s information functional.

3.2. Hyperbolic Diffusion Case

Here, we present a relationship between the Shannon and Fisher functionals for the solution to the TE. We will show that there is a non-local connection between Shannon’s entropy and Fisher’s information. This can be seen by considering that the TE also comes from a non-Markov operator. Using (4) in the formula for the time derivative of $S_{TE}(t)$, we obtain the following:
\frac{dS_{TE}(t)}{dt} = -\int_{-\infty}^{+\infty} dx\, \ln\psi_{TE}(x,t)\,\partial_t\psi_{TE}(x,t) = -\int_{-\infty}^{+\infty} dx\, \ln\psi_{TE}(x,t)\, v^2\!\int_0^t dt'\, e^{-(t-t')/\tau}\,\partial_x^2\psi_{TE}(x,t') = v^2\!\int_0^t dt'\, e^{-(t-t')/\tau}\int_{-\infty}^{+\infty} dx\,\frac{\partial_x\psi_{TE}(x,t)}{\psi_{TE}(x,t)}\,\partial_x\psi_{TE}(x,t') = v^2\!\int_0^t dt'\, e^{-(t-t')/\tau}\, I_{TE}(t,t'),
the last line is obtained by integration by parts, and it shows that $dS_{TE}(t)/dt$ is connected to the non-local Fisher’s information:
I_{TE}(t,t') \equiv \int_{-\infty}^{+\infty} dx\, \frac{\partial_x\psi_{TE}(x,t)}{\psi_{TE}(x,t)}\,\partial_x\psi_{TE}(x,t').
A connection between the non-local x-parameter Fisher’s information and the t-parameter Fisher’s information is demonstrated in Appendix B, as shown in (A18).
As can be seen from (15), only in the limit $\tau\to 0$ do we recover a local relation (with $v\to\infty$, such that $\tau v^2\to D$); then
\lim_{\tau\to 0}\; \tau v^2\int_0^t \frac{dt'}{\tau}\, e^{-(t-t')/\tau}\, I_{TE}(t,t') \;\to\; D\, I_W(t),
obtaining (9).
The case $\tau\to\infty$ must be taken with care because a wave solution $\psi_{WE}(x - vt)$ intrinsically represents a bullet-like initial condition. Therefore, the surface terms that come from integration by parts cannot be taken as zero. In addition, as time goes on, a wave packet that moves without deformation must not show an increase in disorder or a loss of information. To make this analysis, we can take, asymptotically, the limit $t/\tau \ll 1$, so from (15), we obtain the first-order approximation:
\lim_{t/\tau\ll 1}\frac{dS_{TE}(t)}{dt} \simeq \tau v^2\int_0^{t/\tau} dz\, e^{-z}\, I_{TE}(t,\, t - \tau z),
therefore, in the limit $t/\tau\ll 1$, we can approximate $S_{TE}(t)$ as constant in the ballistic regime. See Appendix A.

3.3. Estimation Theory

In many experimental situations, a quantity of interest cannot be measured directly; it can only be inferred from a sample of data. This situation is covered by inference theory, sometimes referred to as the estimation approach [32,33]. For a sample of independent and identically distributed (i.i.d.) random variables, the Fisher measure, in particular the Cramér–Rao lower bound, provides an invaluable tool for analyzing this problem. For stochastic processes, the situation is much more involved. In the diffusion case, where the increments of the Wiener process are i.i.d. random variables, the analysis is quite accessible, while in hyperbolic diffusion, where there are strong correlations, it is largely unexplored.
Estimating the finite diffusion velocity $v$ as precisely as possible by sampling from the probability distribution of the hyperbolic diffusion is a very interesting issue. Likelihood estimation in the TE is important in many experimental situations, and we believe that the ideas presented here will promote this type of analysis. Nevertheless, we note that this subject is outside the scope of the present work and will be considered in the future.

4. Numerical Results for the Telegrapher’s Equation

Here, we show the numerical results obtained from the direct integration of the TE (1) for different values of the parameters $\tau, v$. First, we calculate Shannon’s entropy $S_{TE}(t)$ for the solution to the TE. Unlike for the Wiener process, we cannot find an analytical expression for it, so we perform the calculation numerically. Nevertheless, in Appendix C, we present a cumulant perturbation approach for $\tau\ll 1$.
In Figure 1, we show $\Delta S(t) = S(t) - S(0)$ for the TE process and compare this plot against the analytical result for the Wiener case: $S_W(t) = \ln\sqrt{4\pi e D t}$. In all cases, we use a thin Gaussian as the initial condition, $\psi(x,t)\big|_{t=0} = \exp\left(-x^2/2\sigma^2\right)/\sqrt{2\pi\sigma^2}$, with dispersion $\sigma = 0.07$. The parameters of the TE are $\tau = 5$ and $v = 1$, and for the Wiener process, we take $D = 5$.
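The paper does not spell out its integration scheme; the following is a minimal explicit finite-difference sketch (our own discretization, with hypothetical grid and step sizes) for the same setup (Gaussian initial condition with $\sigma = 0.07$, $\tau = 5$, $v = 1$) that records $\Delta S(t)$:
```python
import numpy as np

# explicit central-difference sketch for  psi_tt + psi_t/tau = v^2 psi_xx
tau, v, sigma = 5.0, 1.0, 0.07
x = np.linspace(-100.0, 100.0, 20001)
dx = x[1] - x[0]
dt = 0.5 * dx / v                              # CFL condition: v*dt/dx <= 1

def entropy(p, dx):
    p = np.clip(p, 1e-300, None)               # guard against log(0) and tiny undershoots
    return -np.sum(p * np.log(p)) * dx         # Delta = 1; it cancels in Delta S

def laplacian(p, dx):
    lap = np.zeros_like(p)
    lap[1:-1] = (p[2:] - 2.0*p[1:-1] + p[:-2]) / dx**2
    return lap

psi_old = np.exp(-x**2/(2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)
psi = psi_old + 0.5 * dt**2 * v**2 * laplacian(psi_old, dx)   # first step, psi_t(x,0) = 0

S0 = entropy(psi_old, dx)
a = 1.0/dt**2 + 1.0/(2.0*tau*dt)
b = 1.0/dt**2 - 1.0/(2.0*tau*dt)
t, times, dS = dt, [], []
while t < 60.0:
    psi_new = (v**2 * laplacian(psi, dx) + 2.0*psi/dt**2 - b*psi_old) / a
    psi_old, psi, t = psi, psi_new, t + dt
    times.append(t)
    dS.append(entropy(psi, dx) - S0)           # compare with S_W(t) = ln sqrt(4*pi*e*D*t), D = tau*v^2
```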
In Figure 2, we show $I(t)$ versus $S(t)$ for the solution to the TE. This plot also shows a linear fit (in red) at short times $0\le t\le 10$, while at long times the behavior is non-linear. The parameters of the TE are as in Figure 1, i.e., $\tau = 5$ and $v = 1$.
In Figure 3, we show the complexity measure $C(t)$ for the solution to the TE. While this measure is constant for the Wiener process, as shown in (11), for the TE it is a non-trivial function of time. The parameters of the TE are as in Figure 1 and Figure 2, i.e., $\tau = 5$ and $v = 1$. This complexity $C(t)$ shows a maximum at time $t_{\max}\approx 25$. This time is one order of magnitude larger than the relaxation time $t_{\mathrm{Relax}}\approx 2\tau$ of the TE; see Appendix A. In Figure 4, we show the complexity measure $C(t)$ for the TE with parameters $\tau = 1$ and $v = 1$. This complexity $C(t)$ shows a narrow peak with a maximum at time $t_{\max}\approx 4$. The time $t_{\max}$ at which the complexity attains its maximum value is characterized by the following relation:
\frac{d}{dt}\ln\!\left(\frac{1}{I_{TE}(t)}\right) = 2\,\frac{d}{dt}S_{TE}(t).
This means that the maximum occurs when the relative rate of change of Fisher’s information, $\dot I_{TE}(t)/I_{TE}(t) = -2\dot S_{TE}(t)$, equals twice the modulus of the entropy rate. In hyperbolic diffusion, the loss of information and the increase in disorder are therefore not equivalent measures. This issue is connected to the fact that the solution of the TE has two quite different behaviors at short and long times, as seen in Appendix A.2.
In Figure 5a,b, we show the behavior of the complexity $C(t)$. That is, we plot the maximum value $C(t_{\max})$ and the time $t_{\max}$ as functions of $\tau$. These values can be fitted by the following expressions:
t_{\max} = 0.77549 + 5.1738\,\tau,
C(t_{\max}) = 0.18314 + 8.2431\,\tau + 70.079\,\tau^2.
For $\tau\gg 1$, we can use the toy model (A10) to characterize the ballistic regime. Then, it is possible to calculate $t_{\max}$ as a function of all the physical parameters. Using (A9) and (A23), we obtain the following:
t_{\max} \simeq 3\tau + \tau\,\log\!\left[\frac{3\exp(3) + \sigma^2/(2v\tau)^2}{\sigma^2/(2v\tau)^2}\right].
In Figure 6, we show the Cramér–Rao inequality (14) applied to the solution of the TE with the parameters $\tau = 5$ and $v = 1$. The plot shows the time-dependent behavior of the product $I(t)\,K_2(t)$. It is to be noted that when the initial condition for $\psi(x,t)$ is not a delta function, as in (3), the second cumulant $K_2(t\to 0)$ does not go to zero. In fact, the second cumulant is given by (A28) and goes to a constant for $t\to 0$. In this case, Fisher’s information does not diverge at $t = 0$. This can be seen in the logarithmic representation in Figure 6 (thus, $\lim_{t\to 0} I(t)\,K_2(t)\to 1$). In the same figure, we plot the Cramér–Rao inequality calculated analytically with our toy model based on the approximation (A10), which is valid for $\tau\gg 1$. The Cramér–Rao product tends, for $t\to\infty$, to the lower bound one, corresponding to the Wiener process, as shown in (13).
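For the toy-model curve of Figure 6, the Cramér–Rao product can be assembled directly from the closed forms (A9) and (A28); a short sketch (ours, with the paper's parameter values) is:
```python
import numpy as np

tau, v, sigma = 5.0, 1.0, 0.07

def I_AW(t):
    # toy-model Fisher information, Eq. (A9)
    return np.exp(-t/(2*tau) + sigma**2/(8*v**2*tau**2)) / sigma**2

def K2(t):
    # second cumulant of the TE solution with a Gaussian IC, Eq. (A28)
    return 2*t*tau*v**2 + 2*tau**2*v**2*(np.exp(-t/tau) - 1) + sigma**2

t = np.linspace(0.0, 60.0, 601)
product = I_AW(t) * K2(t)        # Cramér-Rao product; equals ~1 at t = 0 for this IC
```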
We note that the convergence in time of the solution to the TE toward the Wiener process can also be proved by calculating the time-dependent kurtosis, as shown in Appendix A.3. In addition, in the limit $\tau\ll 1$, a perturbation theory can be presented in terms of all the cumulants of $\psi(x,t)$. Therefore, for example, Shannon’s entropy for the TE can be calculated, as shown in Appendix C.

5. Conclusions

The Wiener process is ubiquitous in nature because it is a time-dependent Gaussian process. For the Wiener process, the complexity $C$ and the Cramér–Rao bound are well known, while if the diffusion process has a finite velocity of propagation, the situation is much less understood. In the present work, we studied the time-dependent Fisher $I_{TE}(t)$ and Shannon $S_{TE}(t)$ functionals associated with hyperbolic diffusion, as characterized by the telegrapher’s equation. The solution to the telegrapher’s equation shows ballistic behavior at short times ($t\ll\tau$), while at long times ($t\gg\tau$) the behavior is diffusive, and the crossover between both regimes occurs on the order of the relaxation time $\tau$. Therefore, it is important to know how information measures behave in a hyperbolic diffusion protocol. In this context, we characterized the complexity $C(t)$ of this hyperbolic diffusion process as a function of time, showing that this measure has a maximum at $t_{\max}$ before relaxing to a constant value. We also presented the behavior of this time $t_{\max}$ as a function of the parameter $\tau$ ($\tau^{-1}$ is the rate of dissipation). The same was done for the Cramér–Rao bound connecting Fisher’s information and the spatial uncertainty $\langle\Delta x(t)^2\rangle$ of the process. A relation between the non-local x-parameter Fisher’s information and the local t-parameter Fisher’s information has been established, see (A18), as well as the time behavior of the Fisher and Shannon functionals for the solution to the telegrapher’s equation. A toy-model approximation, for large $\tau$, was used to calculate analytically the Cramér–Rao inequality as a function of time, as well as the relative entropy in the ballistic regime; see Figure 1.
Numerical results have been used to study the Fisher and Shannon functionals as functions of the parameter $\tau$ and time $t$. In addition, a perturbation approach (in a cumulant series) for small $\tau$ for the solution to the hyperbolic diffusion is presented in Appendix C. This perturbation is a useful tool to analytically approximate many statistical objects in the theory of information.
For many decades, Fisher’s information has been used to find limits on the precision of codes and protocols. On the other hand, the telegrapher’s equation has found applications in many areas of interest where finite-velocity diffusion is the crucial ingredient, such as in engineering code problems, the transport of electromagnetic signals, biophysics of neuronal responses, machine learning, etc. We believe that the present work will stimulate research in the area of the theory of information on hyperbolic diffusion.
Extensions of the present approach can also be considered when the rate of absorption of energy (characterized by the parameter $\tau^{-1}$) has time fluctuations (noise) [11]. In this case, the general interest lies in averaging statistical objects over realizations of the disorder (time fluctuations in the rate $\tau^{-1}$). Work in this direction is in progress.
Our results can also be extended to consider the issue of random initial conditions in the telegrapher’s equation and to model temperature fluctuations. Therefore, the present approach can help in estimating and statistically inferring physical parameters derived from cosmic microwave background data [17].

Author Contributions

M.O.C.: Conceived and designed the study, performed the analytical calculations, and wrote the paper; F.P. and M.N.: Contributed to the calculating and analysis tools, as well as numerical calculations; M.O.C., F.P. and M.N.: Gave final approval of the version to be published. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Acknowledgments

M.O.C. acknowledges the funding provided by Secretaría de Ciencia Técnica y Postgrado, Universidad Nacional de Cuyo.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Solution to the Telegrapher’s Equation

To solve the TE, we use the Fourier and Laplace transform:
\psi(k,s) = \int_{-\infty}^{+\infty} dx\, e^{ikx}\int_0^{\infty} dt\, e^{-st}\,\psi(x,t)
in (1); then using the IC (3), we obtain
\psi(k,s) = \frac{s + 1/\tau}{s\,(s + 1/\tau) + v^2 k^2}.
It is possible to rewrite (A2) in the following form:
\psi(k,s) = \frac{1}{s + D_{\mathrm{eff}}(s)\,k^2},
where $D_{\mathrm{eff}}(s) = v^2/(s + 1/\tau)$ is a generalized diffusion coefficient. Notice that from (A2) and for $\tau s\gg 1$, a wave mode is recovered, i.e., $\psi(k,s) = s/(s^2 + v^2k^2)$, while in the opposite regime ($\tau s\ll 1$) the diffusive mode is recovered, $\psi(k,s) = 1/(s + Dk^2)$, with $D = D_{\mathrm{eff}}(s\to 0) = \tau v^2$. Alternatively, solution (A2) can be written as
\psi(k,s) = \frac{s + 1/\tau}{(s - s_-)(s - s_+)},
with
s_\pm = \frac{-1 \pm \sqrt{1 - (2v\tau k)^2}}{2\tau}.
Therefore, if $2v\tau|k| < 1$, the Fourier mode is localized, while in the opposite case, $|k| > k_c$, the Fourier modes are delocalized damped waves, where
k_c \equiv \frac{1}{2v\tau},
thus, $k_c$ is a critical Fourier value that characterizes the localized gap [11].
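A small numerical illustration (ours) of this mode classification, evaluating $s_\pm(k)$ from (A5) on both sides of $k_c$:
```python
import numpy as np

tau, v = 5.0, 1.0
k_c = 1.0 / (2.0 * v * tau)                 # critical Fourier value, Eq. (A6)

def s_pm(k):
    """Roots s+/-(k) from (A5); they become complex when |k| > k_c (damped oscillatory modes)."""
    disc = np.sqrt(complex(1.0 - (2.0*v*tau*k)**2))
    return (-1.0 + disc)/(2.0*tau), (-1.0 - disc)/(2.0*tau)

for k in (0.5*k_c, 2.0*k_c):
    sp, sm = s_pm(k)
    kind = "localized (overdamped)" if abs(k) < k_c else "delocalized damped wave"
    print("k =", round(k, 4), " s+ =", sp, " s- =", sm, " ->", kind)
```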

Appendix A.1. The Toy Model for τ ≫ 1

Considering $\tau^{-1}$ as an attenuation parameter of a wave-like equation, we can rewrite (1) as:
\left(\partial_t^2 - v^2\,\partial_x^2\right)\psi(x,t) = -\frac{1}{\tau}\,\partial_t\psi(x,t),
where the left-hand side (LHS) represents the wave equation. Thus, if the parameter $\tau$ is large compared to the other time scales of the system, it is possible to write $\psi(x,t)\simeq e^{-t/2\tau}\, f(x - vt)$, where $f(x - vt)$ represents the solution to the wave equation (the LHS in (A7)). For example, for a packet moving to the right, we can write an attenuated wave as follows:
\psi_{AW}(x,t) \propto e^{-x/L}\, g(x - vt),
where there is a simple transformation between $g(x - vt)$ and $f(x - vt)$. Therefore, $L = 2v\tau = k_c^{-1}$ is the spatial attenuation length.
Fisher’s information, $I(t) = \int_{-\infty}^{+\infty} dx\,\left(\partial_x\psi(x,t)\right)^2/\psi(x,t)$, for the attenuated wave (A8) is as follows:
I_{AW}(t) = \frac{1}{\sigma^2}\,\exp\!\left(-\frac{t}{2\tau} + \frac{\sigma^2}{8v^2\tau^2}\right),
where we use the toy model:
\psi_{AW}(x,t) \propto e^{-x/L}\,\frac{\exp\left[-(x - vt)^2/2\sigma^2\right]}{\sqrt{2\pi\sigma^2}}.
Here, $L = k_c^{-1}$ and $\sigma$ is a constant characterizing the width of the initial condition (A26). We neglect the initial condition $\partial_t\psi_{AW}(x,t)\big|_{t=0} = 0$ implicit in (4).
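The closed form (A9) can be cross-checked (our own sketch) by direct quadrature of $I(t) = \int dx\,(\partial_x\psi_{AW})^2/\psi_{AW}$ for the unnormalized profile (A10); grid and parameter values are illustrative:
```python
import numpy as np

tau, v, sigma = 5.0, 1.0, 0.07
Lc = 2.0 * v * tau                            # attenuation length L = 1/k_c

x = np.linspace(-30.0, 60.0, 200001)
dx = x[1] - x[0]

def I_numeric(t):
    psi = np.exp(-x/Lc) * np.exp(-(x - v*t)**2/(2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)
    dpsi = np.gradient(psi, dx)
    return np.sum(dpsi**2 / np.clip(psi, 1e-300, None)) * dx

def I_closed(t):                              # Eq. (A9)
    return np.exp(-t/(2.0*tau) + sigma**2/(8.0*v**2*tau**2)) / sigma**2

for t in (0.0, 5.0, 20.0):
    print(t, I_numeric(t), I_closed(t))       # the two columns should agree
```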
Using (A28), it is possible to obtain the Cramér–Rao measure for the toy model (see Figure 6).

Appendix A.2. The Exact Solution in Real Space-Time

The solution to (1) is obtained by first inverting the Fourier transform of (A2), leading to:
\psi(x,s) = \frac{1}{2v}\,\frac{s + 1/\tau}{\sqrt{s^2 + s/\tau}}\,\exp\!\left(-\frac{|x|}{v}\sqrt{s^2 + s/\tau}\right).
Now, inverting the Laplace transform and using the IC (3), we obtain [34,35,36,37]:
\psi(x,t) = \frac{e^{-t/2\tau}}{2}\left[\delta(x - vt) + \delta(x + vt)\right] + \frac{e^{-t/2\tau}}{4v\tau}\left[I_0(z) + \frac{t}{2z\tau}\, I_1(z)\right]\Theta\!\left(vt - |x|\right),
where $I_0(z)$ and $I_1(z)$ are modified Bessel functions [38], with
z \equiv \frac{\sqrt{(vt)^2 - x^2}}{2v\tau},
and $\Theta(y)$ is the Heaviside function: $\Theta(y) = 0$ if $y < 0$ and $\Theta(y) = 1$ if $y > 0$.
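A quick numerical check (ours) of (A12): integrating the continuous (Bessel) part over $|x| < vt$ and adding the weight $e^{-t/2\tau}$ carried by the two delta peaks should give a total mass of one.
```python
import numpy as np
from scipy.special import i0, i1

tau, v = 5.0, 1.0
t = 10.0

x = np.linspace(-v*t, v*t, 200001)[1:-1]        # open interval: the fronts carry the deltas
dx = x[1] - x[0]
z = np.sqrt((v*t)**2 - x**2) / (2.0*v*tau)
cont = np.exp(-t/(2.0*tau)) / (4.0*v*tau) * (i0(z) + t*i1(z)/(2.0*tau*z))

mass = np.sum(cont)*dx + np.exp(-t/(2.0*tau))   # continuous part + weight of the two delta peaks
print(mass)                                      # should be close to 1
```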

Appendix A.3. Convergence to the Gaussian Diffusion for t/τ ≫ 1

From Formula (A12), it is possible to see that for $\tau\to\infty$ the behavior is ballistic, while if $\tau\to 0$ and $v\to\infty$ with $\tau v^2\to D$, the behavior is diffusive. This conclusion can also be reached when studying the time-dependent kurtosis from the n-th derivative of the characteristic function, showing that:
\lim_{t\to\infty} K(t) = \frac{\langle x(t)^4\rangle}{\langle x(t)^2\rangle^2} \to 3.
All time-dependent moments $M_{2m}(t)\equiv\langle x(t)^{2m}\rangle$ can easily be calculated from the characteristic function $\psi(k,t)$:
M_m(t) = \left.\frac{\partial^m\psi(k,t)}{\partial(ik)^m}\right|_{k=0},
where $\psi(k,t)$ is given by (A24) when using a delta IC, as in (3).
Formula (A15) can be used if the IC is not symmetric or when the IC is not sharp. See Appendix C.

Appendix B. On Non-Local Fisher’s Information I_TE(t,t′)

The non-local x-parameter Fisher’s information (16) can be related to Fisher’s information with respect to the time parameter in the following form: by taking the second time derivative of Shannon’s entropy and integrating by parts, we obtain the following:
\frac{d^2S_{TE}(t)}{dt^2} = -\int dx\,\frac{\left(\partial_t\psi_{TE}(x,t)\right)^2}{\psi_{TE}(x,t)} - \int dx\,\ln\psi_{TE}(x,t)\,\partial_t^2\psi_{TE}(x,t).
From (1), $\partial_t^2\psi(x,t) = v^2\,\partial_x^2\psi(x,t) - \frac{1}{\tau}\,\partial_t\psi(x,t)$, and introducing this operator into (A16), we finally obtain the following:
\frac{d^2S_{TE}(t)}{dt^2} + \frac{1}{\tau}\,\frac{dS_{TE}(t)}{dt} = v^2\, I_{TE}(t) - \int dx\,\frac{\left(\partial_t\psi_{TE}(x,t)\right)^2}{\psi_{TE}(x,t)}.
Now, taking the time derivative in (15) and comparing it with (A17), we obtain the following identity:
v^2\int_0^t dt'\, e^{-(t-t')/\tau}\,\partial_t I_{TE}(t,t') = -\int_{-\infty}^{+\infty} dx\,\frac{\left(\partial_t\psi_{TE}(x,t)\right)^2}{\psi_{TE}(x,t)}.
This establishes a closed connection between the non-local x-parameter Fisher’s information and the local t-parameter Fisher’s information functionals of the TE solution.
Let us write the non-local relation (A18) using the familiar notation for the θ -Fisher’s information:
I(\theta) = \int \left(\partial_\theta\ln\psi(x,\theta)\right)^2\psi(x,\theta)\,dx \equiv \left\langle\left(\partial_\theta\ln\psi(x,\theta)\right)^2\right\rangle = -\left\langle\partial_\theta^2\ln\psi(x,\theta)\right\rangle,
where $\theta$ is any parameter of the solution $\psi(x,t)$, and $\langle\cdot\rangle$ means the mean value over the PDF $\psi(x,t)$. When using the solution to the TE, we can consider the parameters $\tau, v, t$: the relaxation time $\tau$ (or the dissipative rate $\tau^{-1}$), the finite velocity of diffusion $v$, and the time evolution of the process $t$. Therefore, we can rewrite (A18) for the TE as follows:
v^2\int_0^t dt'\, e^{-(t-t')/\tau}\,\partial_t I_x(t,t') = -I_t(t),
where we use the non-local x-Fisher’s functional $I_x(t,t')\equiv I_{TE}(t,t')$ defined in (16), and the t-Fisher’s information $I_t(t)$.
The limits $\tau\to 0$ and $v\to\infty$ with $\tau v^2\to D$ correspond to the Wiener case. Then, from (A20), we obtain the following:
D\,\partial_t I_x(t) = -I_t(t),
as for the usual diffusion process.
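As a consistency check (ours) of (A21) in the Wiener case, using $I_x(t) = 1/(2Dt)$ and computing the t-parameter information of the Gaussian $\psi_W$ directly:
I_t(t) = \int_{-\infty}^{+\infty} dx\,\frac{\left(\partial_t\psi_W(x,t)\right)^2}{\psi_W(x,t)} = \frac{1}{2t^2}, \qquad D\,\partial_t I_x(t) = D\,\partial_t\!\left(\frac{1}{2Dt}\right) = -\frac{1}{2t^2} = -I_t(t).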

Appendix B.1. Relative Entropy from the Toy Model

For $t/\tau\ll 1$, we can use the toy model (A10) to obtain the t-parameter Fisher’s information:
\int_{-\infty}^{+\infty} dx\,\frac{\left(\partial_t\psi_{AW}(x,t)\right)^2}{\psi_{AW}(x,t)} = \frac{\sigma^2/\tau^2 + 4v^2}{4\sigma^2}\; e^{\sigma^2/(8\tau^2v^2)\, -\, t/(2\tau)}.
After using (A9) for the x-parameter Fisher’s information, we solve (A17) and finally obtain the following:
\Delta S(t) \simeq A\, e^{-t/(2\tau)} - B\,\tau\, e^{-t/\tau} + C.
The constants A, B, and C were obtained using a bi-exponential fit ($A = 4.135$, $B = 0.118$, $C = 5.17$); see Figure 1.

Appendix C. Perturbation for Shannon’s Entropy of the Telegrapher’s Equation

From (A4), the inverse Laplace transform gives
\psi(k,t) = \frac{1}{\tau}\,\frac{e^{s_+t} - e^{s_-t}}{s_+ - s_-} + \frac{s_+\, e^{s_+t} - s_-\, e^{s_-t}}{s_+ - s_-},
with $s_\pm(k)$ given by (A5). Because the solution to the TE is not Gaussian, the cumulants $K_m(t)$ of order larger than two are not zero. Therefore, it is better to have an analytical time expression for the cumulants. These objects can be calculated from the logarithm of (A24) in the following form:
K_m(t) = \left.\frac{\partial^m\ln\psi(k,t)}{\partial(ik)^m}\right|_{k=0}.
When the IC is not sharp, for example, if it is
\psi(x,t)\big|_{t=0} = \frac{\exp\left(-x^2/2\sigma^2\right)}{\sqrt{2\pi\sigma^2}},
then moments and cumulants can be calculated from the characteristic function:
\psi(k,t) = e^{-k^2\sigma^2/2}\left[\frac{1}{\tau}\,\frac{e^{s_+t} - e^{s_-t}}{s_+ - s_-} + \frac{s_+\, e^{s_+t} - s_-\, e^{s_-t}}{s_+ - s_-}\right].
Therefore, all statistical objects can be obtained in a straightforward fashion. In particular, the first cumulants are
K_2(t) = 2t\tau v^2 + 2\tau^2 v^2\left(e^{-t/\tau} - 1\right) + \sigma^2
K_4(t) = \tau^4 v^4\left(60 - 12\,e^{-2t/\tau} - 48\,e^{-t/\tau}\right) - t\tau^3 v^4\left(24 + 48\,e^{-t/\tau}\right)
K_6(t) = \tau^6 v^6\left(240\,e^{-3t/\tau} + 1440\,e^{-2t/\tau} + 3600\,e^{-t/\tau} - 5280\right) + t\tau^5 v^6\left(1440 + 1440\,e^{-2t/\tau} + 4320\,e^{-t/\tau}\right) + 1440\,t^2\tau^4 v^6\, e^{-t/\tau}
K_8(t) = -53760\,t^3\tau^5 v^8\, e^{-t/\tau} - \tau^8 v^8\left(10080\,e^{-4t/\tau} + 80640\,e^{-3t/\tau} + 282240\,e^{-2t/\tau} + 564480\,e^{-t/\tau} - 937440\right) - t\tau^7 v^8\left(80640\,e^{-3t/\tau} + 403200\,e^{-2t/\tau} + 725760\,e^{-t/\tau} + 201600\right) - t^2\tau^6 v^8\left(161280\,e^{-2t/\tau} + 322560\,e^{-t/\tau}\right).
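A short numerical sketch (ours) that evaluates (A28)–(A31) as written above and illustrates the convergence to the Gaussian: the normalized cumulants $K_{2m}/K_2^m$ vanish as $t\to\infty$.
```python
import numpy as np

tau, v, sigma = 5.0, 1.0, 0.07

def cumulants(t):
    e1, e2, e3, e4 = (np.exp(-n*t/tau) for n in (1, 2, 3, 4))
    K2 = 2*t*tau*v**2 + 2*tau**2*v**2*(e1 - 1) + sigma**2
    K4 = tau**4*v**4*(60 - 12*e2 - 48*e1) - t*tau**3*v**4*(24 + 48*e1)
    K6 = (tau**6*v**6*(240*e3 + 1440*e2 + 3600*e1 - 5280)
          + t*tau**5*v**6*(1440 + 1440*e2 + 4320*e1)
          + 1440*t**2*tau**4*v**6*e1)
    K8 = (-53760*t**3*tau**5*v**8*e1
          - tau**8*v**8*(10080*e4 + 80640*e3 + 282240*e2 + 564480*e1 - 937440)
          - t*tau**7*v**8*(80640*e3 + 403200*e2 + 725760*e1 + 201600)
          - t**2*tau**6*v**8*(161280*e2 + 322560*e1))
    return K2, K4, K6, K8

for t in (1.0, 10.0, 100.0, 1000.0):
    K2, K4, K6, K8 = cumulants(t)
    print(t, K4/K2**2, K6/K2**3, K8/K2**4)   # normalized cumulants -> 0 as t grows
```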
As expected, we note that for $\tau\to 0$, all cumulants $K_{2m}(t)\to 0$ for $m\ge 2$, and only $K_2(t)\to 2Dt + \sigma^2$ survives (if $v\to\infty$ with $\tau v^2\to D$). Having all cumulants at hand and noting that $K_{2m+1}(t) = 0$, we now introduce a cumulant series expansion for the probability in the following form (the cumulant expansion can be seen in Chapter 1 of [29]):
\psi(x,t) = \int_{-\infty}^{+\infty}\exp\!\left[\sum_{m=1}^{\infty}\frac{(ik)^{2m}K_{2m}(t)}{(2m)!}\right]e^{-ikx}\,\frac{dk}{2\pi} = \int_{-\infty}^{+\infty}\exp\!\left[-\frac{k^2K_2(t)}{2!}\right]\exp\!\left[\sum_{m=2}^{\infty}\frac{(ik)^{2m}K_{2m}(t)}{(2m)!}\right]e^{-ikx}\,\frac{dk}{2\pi} \simeq \int_{-\infty}^{+\infty} e^{-k^2K_2(t)/2}\left[1 + \sum_{m=2}^{\infty}\frac{(ik)^{2m}K_{2m}(t)}{(2m)!} + \frac{1}{2!}\left(\sum_{m=2}^{\infty}\frac{(ik)^{2m}K_{2m}(t)}{(2m)!}\right)^{\!2} + \cdots\right]e^{-ikx}\,\frac{dk}{2\pi} \simeq P_2(x,t) + \int_{-\infty}^{+\infty} e^{-k^2K_2(t)/2}\left[\frac{k^4K_4(t)}{4!} - \frac{k^6K_6(t)}{6!} + \left(\frac{K_8(t)}{8!} + \frac{K_4^2(t)}{2!\,(4!)^2}\right)k^8 + \cdots\right]e^{-ikx}\,\frac{dk}{2\pi} \simeq P_2(x,t) + \frac{K_4(t)}{4!}\,P_4(x,t) - \frac{K_6(t)}{6!}\,P_6(x,t) + \left(\frac{K_8(t)}{8!} + \frac{K_4^2(t)}{2!\,(4!)^2}\right)P_8(x,t) + \cdots,
where $P_2(x,t) = \exp\left(-x^2/2K_2(t)\right)/\sqrt{2\pi K_2(t)}$ is a Gaussian PDF with dispersion $K_2(t)$ given by (A28), and the perturbations are characterized by
P_{2m}(x,t) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} k^{2m}\, e^{-k^2K_2(t)/2}\, e^{-ikx}\,dk = \frac{i^{2m}}{\left(2K_2(t)\right)^{m}}\, H_{2m}\!\left(\frac{x}{\sqrt{2K_2(t)}}\right)P_2(x,t).
Here, $H_{2m}(z)$ represents the Hermite polynomial [38].
Now, we introduce the following notation:
\tilde P_4(x,t)\equiv i^4\,\frac{K_4(t)}{4!}\,P_4(x,t),\qquad \tilde P_6(x,t)\equiv i^6\,\frac{K_6(t)}{6!}\,P_6(x,t),\qquad \tilde P_8(x,t)\equiv i^8\left(\frac{K_8(t)}{8!} + \frac{K_4^2(t)}{2!\,(4!)^2}\right)P_8(x,t).
Therefore, we can use a finite approximation, $\psi(x,t)\simeq\sum_n \tilde P_{2n}(x,t)$ (with $\tilde P_2\equiv P_2$), to calculate Shannon’s entropy and the other measures of information theory applied to the TE. We note that we must be careful when cutting this expansion due to the occurrence of negative tails [29]; see Marcinkiewicz’s theorem [39]. From (A32), it is interesting to note that for small $\tau$, the first contribution $P_2(x,t)$ is dominant, and the corrections $\tilde P_{2m}(x,t)$ are of small intensity (preserving the Wiener structure). In the opposite case, for large $\tau$, the contributions $\tilde P_{2m}(x,t)$ are important corrections of larger intensity (killing the Gaussian term $P_2(x,t)$).
We note that the cumulants $K_{2m}(t)$ and the functions $P_{2m}(x,t)$ can take positive and negative values. In addition, from (A33) and (A34), $\tilde P_{2m}(x,t)$ is an even function whose integral is dimensionless and vanishes, $\int\tilde P_{2m}(x,t)\,dx = 0$ for $m\ge 2$; this last assertion can be proved using properties of the Hermite polynomials [38]. Thus, we see that to any order in the series (A32), we obtain $\int_{-\infty}^{+\infty}\psi(x,t)\,dx = 1$. Noting that $K_{2m}(t)\propto t\,\tau^{2m-1}$ for $m > 1$ at long times, we can write a series expansion for Shannon’s entropy.
Using function (A34) in (5), we can write the following, up to the first corrections in the small parameter τ :
S_{TE}(t) = -\int dx\,\psi(x,t)\ln\left[\psi(x,t)\,\Delta\right] \simeq -\int dx\left[P_2 + \tilde P_4 + \cdots\right]\ln\!\left[\left(P_2 + \tilde P_4 + \tilde P_6 + \cdots\right)\Delta\right] = -\int dx\left[P_2 + \tilde P_4 + \cdots\right]\ln\left(P_2\,\Delta\right) - \int dx\left[P_2 + \tilde P_4 + \cdots\right]\ln\!\left[1 + \frac{\tilde P_4}{P_2} + \frac{\tilde P_6}{P_2} + \cdots\right] \simeq S_2(t) + \ln\frac{1}{\Delta} - \int dx\left[P_2 + \tilde P_4 + \cdots\right]\left[\frac{\tilde P_4}{P_2} + \frac{\tilde P_6}{P_2} - \frac{1}{2}\left(\frac{\tilde P_4}{P_2}\right)^{\!2} - \cdots\right] \simeq \ln\sqrt{2\pi e\,K_2(t)} + \ln\frac{1}{\Delta} - \int dx\,\frac{\tilde P_4(x,t)^2}{2P_2(x,t)} - \int dx\,\frac{\tilde P_6(x,t)^2}{2P_2(x,t)} - \cdots,
where $S_2(t) = \ln\sqrt{2\pi e\,K_2(t)}$ is the Gaussian contribution and we have used $\int dx\,\tilde P_{2m} = \int dx\, x^2\,\tilde P_{2m} = 0$ to cancel the first-order terms.
Therefore, using (A33) and (A34), performing the integrals, and collecting terms, we obtain a perturbative formula for the relative entropy that looks like:
\Delta S_{TE}(t) \simeq \ln\sqrt{2\pi e\,K_2(t)} - \frac{1}{24}\,\frac{K_4(t)^2}{2K_2(t)^4} - \frac{1}{720}\,\frac{K_6(t)^2}{2K_2(t)^6}.
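Finally, a small sketch (ours) that evaluates the right-hand side of (A36) with the cumulants (A28)–(A30); the parameter values are illustrative and the helper names are hypothetical.
```python
import numpy as np

tau, v, sigma = 1.0, 1.0, 0.07     # small tau: the regime where (A36) is expected to hold

def K2(t):
    return 2*t*tau*v**2 + 2*tau**2*v**2*(np.exp(-t/tau) - 1) + sigma**2

def K4(t):
    return (tau**4*v**4*(60 - 12*np.exp(-2*t/tau) - 48*np.exp(-t/tau))
            - t*tau**3*v**4*(24 + 48*np.exp(-t/tau)))

def K6(t):
    return (tau**6*v**6*(240*np.exp(-3*t/tau) + 1440*np.exp(-2*t/tau)
                         + 3600*np.exp(-t/tau) - 5280)
            + t*tau**5*v**6*(1440 + 1440*np.exp(-2*t/tau) + 4320*np.exp(-t/tau))
            + 1440*t**2*tau**4*v**6*np.exp(-t/tau))

def dS_perturbative(t):
    """Relative entropy (A36): Gaussian term plus the first two cumulant corrections
    (1/48 = (1/24)*(1/2) and 1/1440 = (1/720)*(1/2), as written in the text)."""
    k2, k4, k6 = K2(t), K4(t), K6(t)
    return (np.log(np.sqrt(2*np.pi*np.e*k2))
            - k4**2/(48.0*k2**4)
            - k6**2/(1440.0*k2**6))

t = np.array([1.0, 5.0, 20.0, 100.0])
print(dS_perturbative(t))
```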

References

1. Pearson, J.M. A Theory of Waves; Allyn and Bacon: Boston, MA, USA, 1966; Chapter 1.7.
2. Landau, L.D.; Bell, J.S.; Kearsley, M.J. Electrodynamics of Continuous Media; Elsevier Science: Amsterdam, The Netherlands, 2013; Chapter 71; ISBN 9781483293752.
3. Heaviside, O. Electrical Papers of Oliver Heaviside; Chelsea: New York, NY, USA, 1970; Volume 1, p. 307.
4. Nagy, G.B.; Ortiz, O.E.; Reula, O.A. The behavior of hyperbolic heat equations’ solutions near their parabolic limits. J. Math. Phys. 1994, 35, 4334.
5. Compte, A.; Metzler, R. The generalized Cattaneo equation for the description of anomalous transport processes. J. Phys. A Math. Gen. 1997, 30, 7277–7289.
6. Górska, K.; Horzela, A.; Lenzi, E.K.; Pagnini, G.; Sandev, T. Generalized Cattaneo (telegrapher’s) equations in modeling anomalous diffusion phenomena. Phys. Rev. E 2020, 102, 022128.
7. Bear, M.F.; Connors, B.W.; Paradiso, M.A. Neuroscience: Exploring the Brain, 4th ed.; Wolters Kluwer: New York, NY, USA, 2016.
8. Pettersen, K.H.; Einevoll, G.T. Neurophysics: What the telegrapher’s equation has taught us about the brain. In An Anthology of Developments in Clinical Engineering and Bioimpedance: Festschrift for Sverre Grimnes; Martinsen, Ø., Jensen, Ø., Eds.; Unipub Forlag: Oslo, Norway, 2009.
9. Graaff, R.; Hoenders, B.J. Telegrapher’s Equation for Light Transport in Tissue with Substantial Absorption. In Proceedings of the Biomedical Optics 2008, St. Petersburg, FL, USA, 16–19 March 2008.
10. Shlepnev, Y. Coupled 2D telegrapher’s equations for PDN analysis. In Proceedings of the 2012 IEEE 21st Conference on Electrical Performance of Electronic Packaging and Systems, Tempe, AZ, USA, 21–24 October 2012; pp. 171–174.
11. Nizama, M.; Cáceres, M.O. Penetration of waves in global stochastic conducting media. Phys. Rev. E 2023, 107, 054107.
12. Heizler, S.I. Asymptotic telegrapher’s equation (P1) approximation for the transport equation. Nucl. Sci. Eng. 2010, 166, 17–35.
13. Cáceres, M.O.; Wio, H.S. Non-Markovian diffusion-like equation for transport processes with anisotropic scattering. Physica A 1987, 142, 563.
14. Cáceres, M.O.; Nizama, M. Stochastic telegrapher’s approach for solving the random Boltzmann–Lorentz gas. Phys. Rev. E 2022, 105, 044131.
15. Ureña, F.; Gavete, L.; Benito, J.J.; García, A.; Vargas, A.M. Solving the telegraph equation in 2-D and 3-D using generalized finite difference method (GFDM). Eng. Anal. Bound. Elem. 2020, 112, 13–24.
16. Broadbridge, P.; Kolesnik, A.D.; Leonenko, N.; Olenko, A. Random spherical hyperbolic diffusion. J. Stat. Phys. 2019, 177, 889–916.
17. Broadbridge, P.; Kolesnik, A.D.; Leonenko, N.; Olenko, A.; Omari, A.D. Spherically restricted random hyperbolic diffusion. Entropy 2020, 22, 217.
18. Cáceres, M.O. Finite-velocity diffusion in random media. J. Stat. Phys. 2020, 179, 729–747.
19. Sandev, T.; Iomin, A. Finite-velocity diffusion on a comb. Europhys. Lett. 2018, 124, 20005.
20. Keller, J.B. Diffusion at finite speed and random walks. Proc. Natl. Acad. Sci. USA 2004, 101, 1120–1122.
21. Cáceres, M.O. Comments on wave-like propagation with binary disorder. J. Stat. Phys. 2021, 182, 36.
22. Cáceres, M.O. Surface gravity waves on randomly irregular floor and the telegrapher’s equation. AIP Adv. 2021, 11, 045218.
23. Marín, E.; Vaca-Oyola, L.S.; Delgado-Vasallo, O. On thermal waves’ velocity: Some open questions in thermal waves’ physics. Rev. Mex. Fis. E 2016, 62, 1–4.
24. Marín, E. On thermal waves. Eur. J. Phys. 2013, 34, L83–L85.
25. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
26. Fisher, R.A. Statistical Methods and Scientific Inference, 2nd ed.; Oliver and Boyd: London, UK, 1959; Chapter IV.
27. Szabó, J.B.; Sen, K.D.; Nagy, Á. The Fisher–Shannon information plane for atoms. Phys. Lett. A 2008, 372, 2428.
28. Dehesa, J.S.; López-Rosa, S.; Manzano, D. Configuration complexities of hydrogenic atoms. Eur. Phys. J. D 2009, 55, 539.
29. Cáceres, M.O. Non-Equilibrium Statistical Physics with Application to Disordered Systems; Springer: Berlin, Germany, 2017; ISBN 978-3-319-51552-6.
30. Cramér, H. Mathematical Methods of Statistics; Princeton University Press: Princeton, NJ, USA, 1946; Chapter 32; ISBN 0-691-08004-6.
31. Rao, C.R.; Das Gupta, S. (Eds.) Selected Papers of C. R. Rao; Wiley: New York, NY, USA, 1994; ISBN 978-0-470-22091-7.
32. Radaelli, M.; Landi, G.T.; Modi, K.; Binder, F.C. Fisher information of correlated stochastic processes. New J. Phys. 2023, 25, 053037.
33. Yoshida, N. Statistical inference for stochastic processes: Concepts and developments in asymptotic theory. In Stochastic Processes and Applications to Mathematical Finance, Proceedings of the 3rd–7th March 2004 Symposium at Ritsumeikan University (BKC); Biwako Kusatsu Campus, Ritsumeikan University: Kyoto, Japan, 2004.
34. Morse, P.M.; Feshbach, H. Methods of Theoretical Physics; McGraw-Hill: New York, NY, USA, 1953; p. 865.
35. Kac, M. A stochastic model related to the telegrapher’s equation. Rocky Mt. J. Math. 1974, 4, 497.
36. Masoliver, J.; Weiss, G.H. Finite-velocity diffusion. Eur. J. Phys. 1996, 17, 190.
37. Sonnenschein, E. Wave packets and group velocity in absorbing media: Solutions of the telegrapher’s equation. Prog. Electromagn. Res. 2000, 27, 129–158.
38. Spanier, J.; Oldham, K. An Atlas of Functions; Springer: Berlin, Germany, 1987.
39. Marcinkiewicz, J. Sur une propriété de la loi de Gauß. Math. Z. 1939, 44, 612.
Figure 1. $\Delta S(t)$ as a function of time for a Wiener process (red dashed line) and for the solution to the TE (full black line). The blue circles represent $\Delta S(t)$ from the toy-model solution to the TE; the fit is valid for $t/\tau\ll 1$ (see Appendix B.1). The parameters are $\tau = 5$, $v = 1$.
Figure 2. Fisher’s information as a function of entropy for the solution to the TE (black circles). The linear fit (for short times $0\le t\le 10$) is denoted by the dashed red line. The parameters are $\tau = 5$, $v = 1$. The equation of the linear fit is $I(S) = -41.57\,S + 166$.
Figure 3. Complexity $C(t)$ as a function of time for the solution to the TE (full black line). The inset corresponds to the Wiener process (red dashed line). The parameters are $\tau = 5$, $v = 1$.
Figure 4. Complexity $C(t)$ as a function of time for the solution to the TE (full black line). The inset corresponds to the Wiener process (red dashed line). The parameters are $\tau = 1$, $v = 1$.
Figure 5. Characteristics of the complexity measure $C(t)$. (a) Maximum time $t_{\max}$ as a function of $\tau$. (b) Maximum value of the complexity as a function of $\tau$.
Figure 6. Cramér–Rao inequality as a function of time for the solution to the TE (full black line with circles) and for the attenuated wave $\psi(x,t)\propto e^{-x/L}\exp\left[-(x - vt)^2/2\sigma^2\right]/\sqrt{2\pi\sigma^2}$ (toy model (A10), red dashed line). The vertical blue dashed line marks the location of $t_{\max}$ given by formula (22). The parameters are $\tau = 5$, $v = 1$, $\sigma = 0.07$.

Share and Cite

MDPI and ACS Style

Cáceres, M.O.; Nizama, M.; Pennini, F. Fisher and Shannon Functionals for Hyperbolic Diffusion. Entropy 2023, 25, 1627. https://doi.org/10.3390/e25121627

