Article

A Fractional-Order Memristive Two-Neuron-Based Hopfield Neuron Network: Dynamical Analysis and Application for Image Encryption

by Jayaraman Venkatesh 1, Alexander N. Pchelintsev 2,*, Anitha Karthikeyan 3,4, Fatemeh Parastesh 5 and Sajad Jafari 6,7

1 Center for Artificial Intelligence, Chennai Institute of Technology, Chennai 600069, Tamil Nadu, India
2 Department of Higher Mathematics, Tambov State Technical University, Sovetskaya Str. 106, 392000 Tambov, Russia
3 Department of Electronics and Communication Engineering, Vemu Institute of Technology, Chithoor 517112, Andhra Pradesh, India
4 Department of Electronics and Communications Engineering and University Centre for Research & Development, Chandigarh University, Mohali 140413, Punjab, India
5 Centre for Nonlinear Systems, Chennai Institute of Technology, Chennai 600069, Tamil Nadu, India
6 Health Technology Research Institute, Amirkabir University of Technology (Tehran Polytechnic), Tehran 15916-34311, Iran
7 Department of Biomedical Engineering, Amirkabir University of Technology (Tehran Polytechnic), Tehran 15916-34311, Iran
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(21), 4470; https://doi.org/10.3390/math11214470
Submission received: 18 September 2023 / Revised: 14 October 2023 / Accepted: 25 October 2023 / Published: 28 October 2023
(This article belongs to the Special Issue Chaotic Systems and Their Applications)

Abstract:
This paper presents a study on a memristive two-neuron-based Hopfield neural network with fractional-order derivatives. The equilibrium points of the system are identified, and their stability is analyzed. Bifurcation diagrams are obtained by varying the magnetic induction strength and the fractional-order derivative, revealing significant changes in the system dynamics. It is observed that lower fractional orders result in an extended bistability region. Also, chaos is only observed for larger magnetic strengths and fractional orders. Additionally, the application of the fractional-order model for image encryption is explored. The results demonstrate that the encryption based on the fractional model is efficient with high key sensitivity. It leads to an encrypted image with high entropy, negligible correlation coefficient, and uniform distribution. Furthermore, the encryption system shows resistance to differential attacks, cropping attacks, and noise pollution. The Peak Signal-to-Noise Ratio (PSNR) calculations indicate that using a fractional derivative yields a higher PSNR compared to an integer derivative.
MSC:
26A33; 92B20; 65P20; 34C28

1. Introduction

Recently, the field of neural networks has drawn growing interest due to its potential applications in various domains, including pattern recognition, optimization, and associative memory [1,2]. The Hopfield neural network (HNN) is a common and extensively studied neural network model [3]. HNNs are recurrent artificial neural networks possessing associative memory properties. They have been widely used for solving optimization problems and pattern recognition [4,5,6]. The fundamental building block of a Hopfield neural network is the neuron, which processes and transmits information. A large number of HNNs with different numbers of neurons and various activation functions have been proposed [7,8,9,10]. While most existing models are based on four or three neurons, a few recent studies have presented models based on only two neurons. The two-neuron models provide advantages such as simplicity and reduced computational complexity, while the three-neuron models have higher capacity and may result in improved pattern separation. The choice of model depends on the specific problem being addressed and the desired functionality.
Advancements in nanotechnology have led to the realization of a new electronic component called the memristor [11]. Memristors are passive circuit elements with memory-dependent resistance properties. Hence, they can be used to emulate synaptic connections between neurons [12]. Incorporating memristors into neural networks can increase their ability to process and store information, emulating biological functions [13,14], and can thereby enhance learning. As a result, in recent years, memristive neural networks have been proposed as an alternative to traditional neural network models [15,16,17,18].
Furthermore, studies have shown that the application of fractional-order derivatives instead of integer-order ones can lead to more complex dynamics [19]. Fractional calculus provides a powerful mathematical framework for describing systems with memory and long-range dependencies [20], and fractional-order derivatives can improve a network's ability to model and process complex patterns. In other words, more accurate models that describe the behaviors precisely can be obtained using fractional derivatives [21,22]. Subsequently, some studies have focused on analyzing HNNs using fractional derivatives [23,24,25]. For example, Xu et al. [26] presented a new fractional-order chaotic system based on a four-neuron HNN and demonstrated its rich dynamics. Rajagopal et al. [27] investigated a memristive fractional-order HNN and reported hyperchaotic attractors with hidden oscillations at fractional orders. The stability and synchronization of fractional-order HNNs were analyzed in [28]. In that study, a Mittag–Leffler stability criterion in the form of linear matrix inequalities was presented for fractional-order HNNs.
The inherent properties of chaotic systems, such as sensitivity to initial conditions, unpredictability, and complex behavior, make them suitable for many applications, including image encryption [29,30,31]. These complex dynamics can be used to generate pseudo-random numbers as encryption keys. The key streams are combined with the original image through operations like XOR or bit-level permutations. Hence, the pixel values are changed in a chaotic manner, and recovering them is difficult for unauthorized individuals without knowledge of the encryption keys. Researchers have introduced diverse methods for image encryption based on chaotic systems [32,33,34]. In addition, many studies have shown that using HNNs for image encryption ensures the confidentiality and integrity of images [35,36].
In this paper, a memristive two-neuron HNN model is studied by considering fractional-order derivatives. The dynamics of the model are investigated for different fractional orders as a function of the magnetic induction strength. Bifurcation diagrams and attractors reveal the significant impact of the fractional order on the model dynamics. As a common application, the model is used for image encryption. The performance of the encryption system and its robustness to noise are explored. The results show the good performance of the encryption method using the fractional-order model. In addition, the fractional-order model is superior to the integer-order model in terms of noise robustness. The remainder of this paper is organized as follows: Section 2 introduces the two-neuron-based HNN and its fractional-order form. Section 3 discusses the simulation results in three subsections. In the first subsection, the equilibrium points are found, and stability analysis is carried out. In the second subsection, the dynamics of the model are investigated in detail. Finally, in the last subsection, the application of the model to image encryption is studied, and several measures are computed to check its performance. At the end, Sections 4 and 5 provide a discussion of the results and conclude the paper.

2. Fractional-Order Model

A two-neuron Hopfield neural network with electromagnetic induction was introduced by Chen and Min [37] as follows:
$$\begin{aligned} \dot{x}_1 &= -x_1 + w_{11}\tanh x_1 + w_{21}\tanh x_2 + k\,(a - b|\phi|)\,x_1,\\ \dot{x}_2 &= -x_2 + w_{12}\tanh x_1 + w_{22}\tanh x_2,\\ \dot{\phi} &= x_1 - \phi, \end{aligned} \tag{1}$$
where $k$ denotes the strength of the magnetic induction, and the memductance function is taken as $W(\phi) = a - b|\phi|$. The matrix of synaptic weights is as follows:
$$W = \begin{pmatrix} w_{11} & w_{21} \\ w_{12} & w_{22} \end{pmatrix} = \begin{pmatrix} 2 & -1.2 \\ 1.2 & 2 \end{pmatrix}. \tag{2}$$
We consider these equations with fractional-order derivatives as
$$\begin{aligned} D^{q}x_1 &= -x_1 + w_{11}\tanh x_1 + w_{21}\tanh x_2 + k\,(a - b|\phi|)\,x_1,\\ D^{q}x_2 &= -x_2 + w_{12}\tanh x_1 + w_{22}\tanh x_2,\\ D^{q}\phi &= x_1 - \phi, \end{aligned} \tag{3}$$
where  D q  denotes the Caputo fractional operator.
The Riemann–Liouville (RL) fractional integral operator of order $q \geq 0$ of a function $f(t)$ is defined as
$$I^{q}f(t) = \frac{1}{\Gamma(q)}\int_{0}^{t}\frac{f(\tau)}{(t-\tau)^{1-q}}\,d\tau,\qquad t>0,\ q>0, \tag{4}$$
with  Γ  showing the well-known gamma function. The Caputo fractional differential operator is as follows:
$$D^{q}f(t) = I^{\,n-q}D^{n}f(t) = \begin{cases} \dfrac{1}{\Gamma(n-q)}\displaystyle\int_{0}^{t}\dfrac{f^{(n)}(\tau)}{(t-\tau)^{q+1-n}}\,d\tau, & n-1 < q < n,\\[2mm] \dfrac{d^{n}f(t)}{dt^{n}}, & q = n \in \mathbb{N}. \end{cases} \tag{5}$$
To obtain the numerical solution of the fractional-order equations, the method described in [38] is used.
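For readers who want to reproduce the trajectories without the predictor–corrector code of [38], the following minimal Python sketch integrates system (3) with an explicit Grünwald–Letnikov discretization and the short-memory principle. The weights follow Equation (2); the memductance parameters a = 1 and b = 4, the memory length, and the run length are illustrative assumptions, not the exact settings of [38].

```python
import numpy as np

# Synaptic weights from Equation (2)
w11, w21, w12, w22 = 2.0, -1.2, 1.2, 2.0
a, b = 1.0, 4.0   # memductance W(phi) = a - b|phi|; values assumed for illustration

def rhs(state, k):
    """Right-hand side of the fractional system (3)."""
    x1, x2, phi = state
    dx1 = -x1 + w11 * np.tanh(x1) + w21 * np.tanh(x2) + k * (a - b * abs(phi)) * x1
    dx2 = -x2 + w12 * np.tanh(x1) + w22 * np.tanh(x2)
    dphi = x1 - phi
    return np.array([dx1, dx2, dphi])

def gl_solve(x0, k, q=0.98, h=0.01, t_end=500.0, mem_len=2000):
    """Explicit Gruenwald-Letnikov scheme with short-memory truncation:
       x_n = h^q * rhs(x_{n-1}) - sum_{j=1}^{m} c_j * x_{n-j}."""
    n_steps = int(t_end / h)
    c = np.ones(n_steps + 1)                        # binomial weights c_j
    for j in range(1, n_steps + 1):
        c[j] = c[j - 1] * (1.0 - (1.0 + q) / j)
    x = np.zeros((n_steps + 1, 3))
    x[0] = x0
    for n in range(1, n_steps + 1):
        m = min(n, mem_len)                         # short-memory principle
        past = x[n - m:n][::-1]                     # x_{n-1}, ..., x_{n-m}
        x[n] = h**q * rhs(x[n - 1], k) - (c[1:m + 1][:, None] * past).sum(axis=0)
    return x

trajectory = gl_solve(np.array([0.1, 0.0, 0.0]), k=2.0)   # one of the studied initial conditions
```

For q = 1 the weights reduce to c1 = −1 and cj = 0 for j ≥ 2, so the scheme collapses to the forward Euler method, which is a useful sanity check.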

3. Results

In this section, the equilibrium points of the model are found, and their stability is analyzed. Next, the dynamics of the model are investigated for different fractional orders. Finally, the application of the fractional model to image encryption is shown.

3.1. Stability Analysis

The equilibrium point  X * = ( x 1 * , x 2 * , ϕ * )  of the system can be found by the following equations:
$$\begin{aligned} -x_1^{*} + w_{11}\tanh x_1^{*} + w_{21}\tanh x_2^{*} + k\,(a - b|\phi^{*}|)\,x_1^{*} &= 0,\\ -x_2^{*} + w_{12}\tanh x_1^{*} + w_{22}\tanh x_2^{*} &= 0,\\ x_1^{*} - \phi^{*} &= 0, \end{aligned} \tag{6}$$
considering the defined parameters results in
$$x_2^{*} = \operatorname{atanh}\!\left(\frac{-x_1^{*} + 2\tanh x_1^{*} + k\,(1 - 4|x_1^{*}|)\,x_1^{*}}{1.2}\right), \tag{7a}$$
$$x_1^{*} = \operatorname{atanh}\!\left(\frac{x_2^{*} - 2\tanh x_2^{*}}{1.2}\right), \tag{7b}$$
$$x_1^{*} = \phi^{*}. \tag{7c}$$
Since it is difficult to solve for the roots directly, a graphical method is adopted. The first two equations, (7a) and (7b), are plotted in Figure 1a. Note that the solution of the second equation does not depend on $k$ and is shown in blue. The solution of the first equation is shown in different colors for three values of $k$, namely $k = 0.8$, $1.5$, and $2$. It can be seen that the only intersection of the two curves for any $k$ value is $x_1^{*} = 0$ and $x_2^{*} = 0$. Therefore, the system has one equilibrium point, $X^{*} = (0, 0, 0)$. To determine the stability of the equilibrium point, the Jacobian of the system is needed, which, evaluated at the equilibrium, is as follows:
$$J = \begin{pmatrix} 1 + k & -1.2 & 0\\ 1.2 & 1 & 0\\ 1 & 0 & -1 \end{pmatrix}. \tag{8}$$
The characteristic equation of the Jacobian is as follows:
$$(\lambda + 1)\left(\lambda^{2} - (k+2)\lambda + 2.44\right) = 0, \tag{9}$$
leading to the eigenvalues $\lambda_1 = -1$ and $\lambda_{2,3} = \dfrac{k \pm \sqrt{k^{2} + 4k - 5.76}}{2} + 1$. The equilibrium point of a fractional-order system is stable if and only if $|\arg(\lambda)| > \dfrac{q\pi}{2}$, where $\arg(\lambda)$ is the principal argument of $\lambda$. The argument of $\lambda_1$ is $\pi$; therefore, the stability condition is met for this eigenvalue for any value of $k$.
The real and imaginary parts of the other two eigenvalues are shown in Figure 1b,c as functions of $k$. In Figure 1d, $\arg(\lambda_{2,3})$ is illustrated. For $1.125 < k < 2$, the argument is zero; hence, the condition does not hold. For $0.8 < k < 1.125$, the condition is met only for fractional orders $q < \dfrac{2|\arg(\lambda)|}{\pi}$, which is very small. Hence, for $0.9 < q < 1$, $|\arg(\lambda)| < \dfrac{q\pi}{2}$, and the equilibrium point is unstable. Overall, considering $0.9 < q < 1$, the equilibrium point is unstable for all $k$ values.
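The condition above is easy to verify numerically. The short sketch below takes the eigenvalues directly from the characteristic Equation (9) and checks |arg(λ)| > qπ/2 for a few k values; the helper name is purely illustrative.

```python
import numpy as np

def equilibrium_is_stable(k, q):
    """Stability check |arg(lambda)| > q*pi/2 for the eigenvalues of Equation (9)."""
    disc = np.sqrt(complex(k**2 + 4.0 * k - 5.76))     # discriminant of the quadratic factor
    lams = [-1.0 + 0j, 1.0 + (k + disc) / 2.0, 1.0 + (k - disc) / 2.0]
    return all(abs(np.angle(lam)) > q * np.pi / 2.0 for lam in lams)

for k in (0.8, 1.0, 1.5, 2.0):
    print(k, equilibrium_is_stable(k, q=0.98))   # prints False: the origin is unstable for 0.9 < q < 1
```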

3.2. Dynamical Analysis

The introduced integer-order model exhibits rich dynamics as the magnetic induction strength ($k$) is varied. The bifurcation diagram of the integer-order model as a function of $k$ is shown in Figure 2a. It should be noted that, for better representation, only the positive peaks are shown in the bifurcation diagrams. The orange and blue colors correspond to two initial conditions, $(0.1, 0, 0)$ and $(-0.1, 0, 0)$. It can be observed that the model has coexisting attractors at particular values of $k$. Next, we change the derivative order to fractional values to observe the effect on the bifurcations. The bifurcation diagrams corresponding to $q = 0.98$, $q = 0.96$, and $q = 0.94$ are shown in parts (b) to (d) of Figure 2. Comparing parts (a) and (b) shows that as the derivative order changes to $q = 0.98$, the first chaotic region extends to larger $k$ values. Also, the three-piece chaotic parts are cut in half. Moreover, the periodic region in $2.69 < k < 2.9$ vanishes. When the fractional order changes to $0.96$ and $0.94$, the first periodic and chaotic regions extend further, and the three-piece chaotic regions are no longer observable. It is also evident that bistability occurs over a larger area as the fractional order decreases.
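A diagram of this type can be reproduced with the sketch below, which reuses the gl_solve routine from the Section 2 sketch, discards a transient, and plots the positive local maxima of x1 for each k. The sweep resolution, run length, and colors are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

def positive_peaks(signal):
    """Positive local maxima of a 1-D signal."""
    s = np.asarray(signal)
    mask = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]) & (s[1:-1] > 0)
    return s[1:-1][mask]

k_values = np.linspace(0.8, 2.9, 150)
for x0, color in ((np.array([0.1, 0.0, 0.0]), 'tab:blue'),
                  (np.array([-0.1, 0.0, 0.0]), 'tab:orange')):
    for k in k_values:
        traj = gl_solve(x0, k, q=0.98, h=0.01, t_end=300.0)   # gl_solve from the Section 2 sketch
        x1 = traj[len(traj) // 2:, 0]                         # drop the transient half
        peaks = positive_peaks(x1)
        plt.plot(np.full(peaks.shape, k), peaks, '.', color=color, markersize=0.5)
plt.xlabel('k'); plt.ylabel('peaks of x1'); plt.show()
```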
The bifurcation diagram as a function of $q$ is shown in Figure 3 for $k = 1.2$, $1.4$, $1.6$, $1.8$, and $2$. Note that only the result for the initial condition $(0.1, 0, 0)$ is illustrated. The variation of $q$ is considered in the range $[0.9, 1]$, below which the dynamics remain periodic. It can be observed that for $k = 1.2$ and $1.4$, the dynamics are only periodic and do not change by varying $q$. For larger $k$ values, chaotic dynamics emerge at large fractional orders. Furthermore, for larger $k$, the chaotic region is extended.
The bifurcation diagrams show that for a fixed  k , varying the fractional order can result in significant changes in dynamics. For example, the attractors of the system for  k = 2.1  and different fractional orders are shown in Figure 4. The integer-order model has a coexistence of period-3 attractors (Figure 4a) which are symmetric. By changing the derivative order to  q = 0.98 , the dynamics change to chaotic and are also symmetric (Figure 4b). For this fractional order, the same attractor attracts both initial conditions. Decreasing the fractional order to  q = 0.96  leads to a symmetric pair of chaotic attractors (Figure 4c). Finally, for  q = 0.94 , two symmetric period-2 attractors emerge (Figure 4d). The time series corresponding to the attractors shown in Figure 4 are demonstrated in Figure 5.
The other important point is that the basins of attraction change when the fractional order varies. Figure 6 represents the basins of attraction of the two chaotic attractors. The value of the magnetic strength is set to $k = 1.5$, $1.8$, $2.1$, and $2.5$ in parts (a) to (d) for $q = 1$, $0.98$, $0.96$, and $0.94$, respectively. The basins of attraction change remarkably as the fractional order changes. Moreover, for $q = 0.96$ and $q = 0.94$, the basins of the two attractors are swapped relative to those for $q = 0.98$ and $q = 1$.

3.3. Application in Image Encryption

To demonstrate the applicability of the fractional model, we use it for image encryption. A variety of encryption algorithms have been proposed in recent years [39,40,41]. Here, the encryption method presented in [42,43] is used. In this method, firstly, special initial conditions and parameters are selected to produce chaotic dynamics. Here, the initial condition $(0.1, 0, 0)$ is used. The system is solved, and the time series are obtained. To solve the fractional system, the numerical method presented in [38] is used with a total run time of 500 and a time step of 0.01. Next, the float values of the time series are converted to 32-bit binary values, where 3 bits correspond to the integer part and the other 29 bits correspond to the fractional part. To generate the random numbers, the 20 least significant bits of each value are used and put in a vector. Using the randomly generated numbers from the $x_1$ and $\phi$ variables, the rows and columns of the image are shuffled. Then, the shuffled image is XORed with the XOR of the random vectors obtained from the $x_1$ and $x_2$ variables. Finally, the result is converted back to decimal, and the encrypted image is obtained. The flowchart of the image encryption algorithm is given in Figure 7. For decryption, the inverse of the encryption steps is applied.
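The sketch below is a loose Python rendering of these steps. The fixed-point quantization, the helper names, and the recycling of the keystream when it is shorter than the image are our own illustrative simplifications, not the exact routine of [42,43].

```python
import numpy as np

def keystream_bits(series, n_bits=20):
    """Quantize a chaotic series to fixed point (29 fractional bits) and keep the n_bits LSBs."""
    fixed = (np.abs(series) * 2**29).astype(np.uint64)
    return fixed & ((1 << n_bits) - 1)

def encrypt(img, traj):
    """Row/column shuffling driven by x1 and phi, then XOR with a byte stream from x1 and x2."""
    m, n = img.shape
    x1, x2, phi = traj[:, 0], traj[:, 1], traj[:, 2]
    row_perm = np.argsort(keystream_bits(x1)[:m])               # row shuffling order
    col_perm = np.argsort(keystream_bits(phi)[:n])              # column shuffling order
    shuffled = img[row_perm][:, col_perm]
    stream = (keystream_bits(x1) ^ keystream_bits(x2)) & 0xFF   # one byte per sample
    key = np.resize(stream, m * n).reshape(m, n).astype(img.dtype)
    return shuffled ^ key, (row_perm, col_perm, key)

def decrypt(cipher, keys):
    """Inverse of encrypt(): undo the XOR, then undo the permutation."""
    row_perm, col_perm, key = keys
    shuffled = cipher ^ key
    img = np.empty_like(shuffled)
    img[np.ix_(row_perm, col_perm)] = shuffled
    return img
```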
To demonstrate the power of the fractional-order system in image encryption, $q = 0.98$ and $k = 2$ are used. The encryption is applied to two images. The original images are shown in Figure 8a,d. The encrypted images are shown in Figure 8b,e, and the decrypted images are represented in Figure 8c,f. It can be observed that the fractional system has enough complexity for image encryption.

3.3.1. Randomness Test

To test the randomness of a pseudo-random sequence, the NIST SP 800-22 test suite can be used. Each test yields a p-value, which should be higher than 0.01 for the test to be passed. The results of this suite for the random sequence generated by our algorithm are presented in Table 1. It can be observed that the proposed random sequence has successfully passed all of the tests.

3.3.2. Key Sensitivity Analysis

One of the important factors in the evaluation of encryption is the sensitivity of the system's keys. To assess the key sensitivity, the ciphertext image is decrypted after making a small change to the encryption key. When the sensitivity is high, it is not possible to recognize the original image. To compare the decrypted image after the key variation with the original image, the Mean Squared Error (MSE) is computed:
$$MSE = \frac{1}{M \times N}\sum_{i=1}^{M}\sum_{j=1}^{N}\left(d_{ij} - d'_{ij}\right)^{2},$$
where $M$ and $N$ represent the length and width of the image, and $d_{ij}$ and $d'_{ij}$ refer to the original and the decrypted images, respectively.
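A minimal implementation of this measure (a sketch, assuming 8-bit grayscale arrays of equal size) is:

```python
import numpy as np

def mse(original, decrypted):
    """Mean squared error between two images of equal size."""
    diff = original.astype(np.float64) - decrypted.astype(np.float64)
    return float(np.mean(diff ** 2))
```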
As mentioned, the initial condition  ( 0.1 , 0 , 0 )  and the parameters  q = 0.98  and  k = 2  were chosen as the keys of encryption. In the first test, we change the initial condition to  ( 0.100001 , 0 , 0 )  and perform the decryption. In Figure 9, the decrypted images with the correct key are shown in the left column. The middle column shows the decrypted images with the incorrect initial key. It can be observed that the original image is not obtained. The MSE for the onion image is  5345.45  and for the cameraman image is  7661.16 .
Next, we test the effect of change in the parameters. To this aim, the correct key parameter  k = 2  is changed to  k = 1.99 . The decrypted image with the incorrect parameter key is shown in the right column of Figure 9. It is evident that the decryption has been performed incorrectly. The MSE for the onion and cameraman images is  5430.03  and  7871.31 , respectively.

3.3.3. Statistical Analysis

To investigate the encryption's performance, firstly, the histograms of the images are obtained and shown in Figure 10. The histograms of the original images are illustrated in Figure 10a,c and show unevenly distributed intensity values. The histograms of the encrypted images are shown in Figure 10b,d and exhibit uniform distributions. Therefore, the fractional system is suitable for encryption. In the next step, the correlation of the color depth of two adjacent pixels is obtained. The correlations of the original images are illustrated in Figure 11a,c, showing a strong correlation between adjacent pixels. The correlations of the encrypted images are shown in Figure 11b,d. It is observed that the correlation of the original images disappears in the encrypted images.
Another commonly used measure is the information entropy. The information entropy represents the randomness of information. The Shannon entropy can be obtained as follows:
$$H(s) = \sum_{i=0}^{N-1} P(s_i)\,\log_{2}\!\left(\frac{1}{P(s_i)}\right),$$
where the sum runs over the $N$ gray levels, and $P(s_i)$ is the occurrence probability of gray level $s_i$ in the image. The larger the entropy, the higher the uncertainty. The entropies of the original onion and cameraman images are 7.3325 and 7.0097, respectively. The entropies of the encrypted images are 7.9973 and 7.9972, respectively, which shows a clear increase.
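The entropy values above can be reproduced directly from the gray-level histogram; a short sketch for 8-bit images follows.

```python
import numpy as np

def shannon_entropy(img):
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]                      # skip gray levels that never occur
    return float(-np.sum(p * np.log2(p)))
```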

3.3.4. Differential Analysis

The differential attack refers to detecting the effect of small changes in a pixel of the plain image on the ciphertext image. The Number of Pixels Change Rate (NPCR) and the Unified Average Changing Intensity (UACI) are two measures for testing the differential attack. These measures are calculated as follows:
$$NPCR = \frac{1}{M \times N}\sum_{i=1}^{M}\sum_{j=1}^{N}\operatorname{sign}\!\left(\left|x_{ij}\right|\right)\times 100\%,\qquad UACI = \frac{1}{M \times N}\sum_{i=1}^{M}\sum_{j=1}^{N}\frac{\left|x_{ij}\right|}{255}\times 100\%,$$
where $x_{ij} = c_1(i,j) - c_2(i,j)$, and $c_1$ and $c_2$ are the encrypted images before and after changing one pixel, respectively. The NPCR and the UACI computed for the encrypted onion image are 0.9963 and 0.3028, and for the cameraman image, they are 0.9962 and 0.3127, respectively. Hence, the encryption system has strong plaintext sensitivity and can resist differential attacks.
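Both measures follow directly from the pixel-wise difference of the two ciphertexts; a minimal sketch:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR (%) and UACI (%) between two encrypted 8-bit images of equal size."""
    diff = c1.astype(np.int16) - c2.astype(np.int16)
    npcr = float(np.mean(diff != 0) * 100.0)
    uaci = float(np.mean(np.abs(diff) / 255.0) * 100.0)
    return npcr, uaci
```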

3.3.5. Robustness Analysis

Cropping Attack

A possible occurrence during transmission is the loss of information. To test the robustness of the encryption system to information loss, a cropping attack test is performed. To this aim, a part of the encrypted image is cropped, the cropped region is filled with zeros, and the image is then decrypted. Figure 12 shows the decryption results of cropped onion images. In parts (a) to (c), 1/64, 1/16, and 1/4 of the image are cropped, respectively. The MSE between the original and decrypted images is 129.89, 533.77, and 2159.08 in parts (a) to (c), respectively. Therefore, the decryption is robust to small information loss. For larger information loss, although the original image is still recognizable, the MSE increases.
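A cropping test of this kind can be scripted as below; the block position and the reuse of the earlier decrypt and mse sketches are illustrative assumptions.

```python
import numpy as np

def crop_attack(cipher, fraction):
    """Zero out a top-left block covering `fraction` of the encrypted image."""
    attacked = cipher.copy()
    m, n = cipher.shape
    h = int(round(m * np.sqrt(fraction)))
    w = int(round(n * np.sqrt(fraction)))
    attacked[:h, :w] = 0
    return attacked

# Example: lose 1/16 of the ciphertext, decrypt, and measure the damage
# recovered = decrypt(crop_attack(cipher, 1/16), keys)
# print(mse(original, recovered))
```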

Noise Attack

Robustness to noise is of great importance in encryption methods. To evaluate the robustness to noise of the encryption method using the fractional-order model, the salt-and-pepper noise is added to the encrypted image, and then it is decrypted. The decrypted images of the encrypted images with different intensities are shown in Figure 13. It can be seen that the original image can be basically recovered.
Finally, we compare the performance of the fractional-order and integer-order models for image encryption. To this aim, we apply the model with different $k$ values for image encryption, and then salt-and-pepper noise with intensity 0.1 is added to the encrypted image. The Peak Signal-to-Noise Ratio (PSNR) of the decrypted image with respect to the original image is computed. The derivative order is considered to be $q = 1$, $0.98$, $0.96$, and $0.94$. For each $q$, the range of $k$ is chosen according to the bifurcation diagram such that the system exhibits monostable chaotic dynamics. Figure 14 shows the PSNR as a function of $k$ for different derivative orders. It was found that for the integer-order model, the best PSNR is 18.77, corresponding to $k = 1.812$. For $q = 0.98$, the best PSNR is 18.81, obtained at $k = 2.199$. For $q = 0.96$ and $q = 0.94$, the highest PSNR values are 18.78 and 18.79, obtained by setting $k = 2.597$ and $k = 2.817$, respectively. Therefore, the fractional-order systems can yield a higher PSNR than the integer-order system.
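The PSNR used for this comparison follows from the MSE defined earlier; a minimal sketch (assuming 8-bit images):

```python
import numpy as np

def psnr(original, decrypted, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB; mse() is the helper from the key-sensitivity sketch."""
    err = mse(original, decrypted)
    return float('inf') if err == 0 else 10.0 * np.log10(peak**2 / err)
```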

4. Discussion

In this study, we investigated the behavior of a memristive two-neuron-based Hopfield neural network utilizing fractional-order derivatives. Involving fractional derivatives makes the chaotic models more complex than the integer-order model. This complexity makes it difficult to derive analytical solutions and requires advanced mathematical techniques for modeling and analysis. Our analysis delved into the equilibrium points of the system and examined their stability, offering valuable insights into the network’s dynamics. By constructing various bifurcation diagrams, we shed light on how changes in both magnetic induction strength and fractional order significantly influence the network’s behavior. Notably, we observed that lower fractional orders expand the region of bistability, while chaotic behavior is only observed for higher magnetic strengths and fractional orders.
Moreover, our research explored the practical application of the fractional-order model for image encryption. The results provided compelling evidence for the efficacy of this model in encrypting images. The randomness of the random sequence was tested by the NIST test. Several measures were calculated, and it was shown that the system has a high sensitivity to the key variation and is also robust to information loss. Additionally, we assessed its robustness against noise by introducing salt-and-pepper noise to the encrypted images, and our calculations demonstrated that employing fractional derivatives results in a better Peak Signal-to-Noise Ratio (PSNR) compared to integer derivatives, indicating improved resistance to noise. The encryption results obtained in this paper can be compared to some recent studies given in Table 2. Compared to the other studies, our results are acceptable and better in some cases. Therefore, using the fractional-order HNN in a simple encryption system can provide a good performance.

5. Conclusions

In summary, this study significantly contributes to our understanding of memristive two-neuron-based Hopfield neural networks employing fractional-order derivatives. It underscores the potential applications of fractional derivative systems, particularly in the context of image encryption. Our findings emphasize the importance of incorporating fractional-order derivatives into neural network models, as they exert a substantial influence on system dynamics and performance.
We acknowledge certain limitations in this study, which are pivotal in contextualizing our findings. Our research predominantly focuses on a specific neural network type employing fractional-order derivatives, limiting the generalizability of our results to other neural network architectures. Additionally, our study assumes a simplified model for the neural network, potentially overlooking the complexity inherent in real-world neural networks. To address these limitations, future work should include the analysis of real-world data to validate the practicality and effectiveness of our findings in real-world scenarios. Assessing its performance across a range of applications, including pattern recognition, optimization, and control systems, is also recommended to validate its practical utility.

Author Contributions

Conceptualization, S.J. and F.P.; methodology, F.P. and A.N.P.; software, A.N.P.; validation, A.K. and A.N.P.; investigation, A.N.P. and A.K.; writing—original draft preparation, J.V., A.N.P. and A.K.; writing—review and editing, F.P. and S.J.; supervision, S.J.; funding acquisition, J.V. All authors have read and agreed to the published version of the manuscript.

Funding

This work is partially funded by the Center for Artificial Intelligence, Chennai Institute of Technology, India vide funding number CIT/CAI/2023/RP/012.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Abiodun, O.I.; Jantan, A.; Omolara, A.E.; Dada, K.V.; Umar, A.M.; Linus, O.U.; Arshad, H.; Kazaure, A.A.; Gana, U.; Kiru, M.U. Comprehensive review of artificial neural network applications to pattern recognition. IEEE Access 2019, 7, 158820–158846. [Google Scholar] [CrossRef]
  2. Abdolrasol, M.G.; Hussain, S.S.; Ustun, T.S.; Sarker, M.R.; Hannan, M.A.; Mohamed, R.; Ali, J.A.; Mekhilef, S.; Milad, A. Artificial neural networks based optimization techniques: A review. Electronics 2021, 10, 2689. [Google Scholar] [CrossRef]
  3. Wen, U.-P.; Lan, K.-M.; Shih, H.-S. A review of Hopfield neural networks for solving mathematical programming problems. Eur. J. Oper. Res. 2009, 198, 675–687. [Google Scholar] [CrossRef]
  4. Gong, Y.; Xiao, Z.; Tan, X.; Sui, H.; Xu, C.; Duan, H.; Li, D. Context-aware convolutional neural network for object detection in VHR remote sensing imagery. IEEE Trans. Geosci. Remote Sens. 2019, 58, 34–44. [Google Scholar] [CrossRef]
  5. Joya, G.; Atencia, M.; Sandoval, F. Hopfield neural networks for optimization: Study of the different dynamics. Neurocomputing 2002, 43, 219–237. [Google Scholar] [CrossRef]
  6. Lin, H.; Wang, C.; Tan, Y. Hidden extreme multistability with hyperchaos and transient chaos in a Hopfield neural network affected by electromagnetic radiation. Nonlinear Dyn. 2020, 99, 2369–2386. [Google Scholar] [CrossRef]
  7. Njitacke, Z.T.; Kengne, J. Nonlinear dynamics of three-neurons-based Hopfield neural networks (HNNs): Remerging Feigenbaum trees, coexisting bifurcations and multiple attractors. J. Circuits Syst. Comput. 2019, 28, 1950121. [Google Scholar] [CrossRef]
  8. Xu, Q.; Song, Z.; Bao, H.; Chen, M.; Bao, B. Two-neuron-based non-autonomous memristive Hopfield neural network: Numerical analyses and hardware experiments. AEU-Int. J. Electron. Commun. 2018, 96, 66–74. [Google Scholar] [CrossRef]
  9. Chen, C.; Bao, H.; Chen, M.; Xu, Q.; Bao, B. Non-ideal memristor synapse-coupled bi-neuron Hopfield neural network: Numerical simulations and breadboard experiments. AEU-Int. J. Electron. Commun. 2019, 111, 152894. [Google Scholar] [CrossRef]
  10. Huang, Y.; Yang, X.-S. Hyperchaos and bifurcation in a new class of four-dimensional Hopfield neural networks. Neurocomputing 2006, 69, 1787–1795. [Google Scholar]
  11. Strukov, D.B.; Snider, G.S.; Stewart, D.R.; Williams, R.S. The missing memristor found. Nature 2008, 453, 80–83. [Google Scholar] [CrossRef] [PubMed]
  12. Lin, H.; Wang, C.; Sun, Y.; Yao, W. Firing multistability in a locally active memristive neuron model. Nonlinear Dyn. 2020, 100, 3667–3683. [Google Scholar] [CrossRef]
  13. He, S.; Liu, J.; Wang, H.; Sun, K. A discrete memristive neural network and its application for character recognition. Neurocomputing 2023, 523, 1–8. [Google Scholar] [CrossRef]
  14. Thomas, A. Memristor-based neural networks. J. Phys. D Appl. Phys. 2013, 46, 093001. [Google Scholar] [CrossRef]
  15. Bao, B.; Qian, H.; Xu, Q.; Chen, M.; Wang, J.; Yu, Y. Coexisting behaviors of asymmetric attractors in hyperbolic-type memristor based Hopfield neural network. Front. Comput. Neurosci. 2017, 11, 81. [Google Scholar] [CrossRef]
  16. Zhang, S.; Zheng, J.; Wang, X.; Zeng, Z.; He, S. Initial offset boosting coexisting attractors in memristive multi-double-scroll Hopfield neural network. Nonlinear Dyn. 2020, 102, 2821–2841. [Google Scholar] [CrossRef]
  17. Lin, H.; Wang, C.; Sun, Y. A universal variable extension method for designing multiscroll/wing chaotic systems. IEEE Trans. Ind. Electron. 2023, 1–13. [Google Scholar] [CrossRef]
  18. Lin, H.; Wang, C.; Du, S.; Yao, W.; Sun, Y. A family of memristive multibutterfly chaotic systems with multidirectional initial-based offset boosting. Chaos Solitons Fractals 2023, 172, 113518. [Google Scholar] [CrossRef]
  19. Dalir, M.; Bashour, M. Applications of fractional calculus. Appl. Math. Sci. 2010, 4, 1021–1032. [Google Scholar]
  20. Yang, X.-J. Advanced Local Fractional Calculus and Its Applications; World Science Publisher: New York, NY, USA, 2012. [Google Scholar]
  21. Aguilar, C.Z.; Gómez-Aguilar, J.; Alvarado-Martínez, V.; Romero-Ugalde, H. Fractional order neural networks for system identification. Chaos Solitons Fractals 2020, 130, 109444. [Google Scholar] [CrossRef]
  22. He, S.; Sun, K.; Banerjee, S. Dynamical properties and complexity in fractional-order diffusionless Lorenz system. Eur. Phys. J. Plus 2016, 131, 254. [Google Scholar] [CrossRef]
  23. Ma, C.; Mou, J.; Yang, F.; Yan, H. A fractional-order hopfield neural network chaotic system and its circuit realization. Eur. Phys. J. Plus 2020, 135, 100. [Google Scholar] [CrossRef]
  24. Wang, H.; Yu, Y.; Wen, G. Stability analysis of fractional-order Hopfield neural networks with time delays. Neural Netw. 2014, 55, 98–109. [Google Scholar] [CrossRef]
  25. Zhang, S.; Yu, Y.; Wang, Q. Stability analysis of fractional-order Hopfield neural networks with discontinuous activation functions. Neurocomputing 2016, 171, 1075–1084. [Google Scholar] [CrossRef]
  26. Xu, S.; Wang, X.; Ye, X. A new fractional-order chaos system of Hopfield neural network and its application in image encryption. Chaos Solitons Fractals 2022, 157, 111889. [Google Scholar] [CrossRef]
  27. Rajagopal, K.; Tuna, M.; Karthikeyan, A.; Koyuncu, İ.; Duraisamy, P.; Akgul, A. Dynamical analysis, sliding mode synchronization of a fractional-order memristor Hopfield neural network with parameter uncertainties and its non-fractional-order FPGA implementation. Eur. Phys. J. Spec. Top. 2019, 228, 2065–2080. [Google Scholar] [CrossRef]
  28. Wang, F.; Liu, X.; Tang, M.; Chen, L. Further results on stability and synchronization of fractional-order Hopfield neural networks. Neurocomputing 2019, 346, 12–19. [Google Scholar] [CrossRef]
  29. Hua, Z.; Zhou, Y.; Huang, H. Cosine-transform-based chaotic system for image encryption. Inf. Sci. 2019, 480, 403–419. [Google Scholar] [CrossRef]
  30. Volos, C.K.; Kyprianidis, I.M.; Stouboulos, I.; Pham, V.-T. Image encryption scheme based on non-autonomous chaotic systems. In Computation, Cryptography, and Network Security; Springer: Berlin/Heidelberg, Germany, 2015; pp. 591–612. [Google Scholar]
  31. Wang, L.; Jiang, S.; Ge, M.-F.; Hu, C.; Hu, J. Finite-/fixed-time synchronization of memristor chaotic systems and image encryption application. IEEE Trans. Circuits Syst. I Regul. Pap. 2021, 68, 4957–4969. [Google Scholar] [CrossRef]
  32. Wang, X.; Çavuşoğlu, Ü.; Kacar, S.; Akgul, A.; Pham, V.-T.; Jafari, S.; Alsaadi, F.E.; Nguyen, X.Q. S-box based image encryption application using a chaotic system without equilibrium. Appl. Sci. 2019, 9, 781. [Google Scholar] [CrossRef]
  33. Ma, X.; Wang, C.; Qiu, W.; Yu, F. A fast hyperchaotic image encryption scheme. Int. J. Bifurc. Chaos 2023, 33, 2350061. [Google Scholar] [CrossRef]
  34. Ma, X.; Wang, C. Hyper-chaotic image encryption system based on N+ 2 ring Joseph algorithm and reversible cellular automata. Multimed. Tools Appl. 2023, 82, 38967–38992. [Google Scholar] [CrossRef]
  35. Lai, Q.; Wan, Z.; Zhang, H.; Chen, G. Design and analysis of multiscroll memristive hopfield neural network with adjustable memductance and application to image encryption. IEEE Trans. Neural Netw. Learn. Syst. 2022, 34, 7824–7837. [Google Scholar] [CrossRef]
  36. Wang, X.-Y.; Li, Z.-M. A color image encryption algorithm based on Hopfield chaotic neural network. Opt. Lasers Eng. 2019, 115, 107–118. [Google Scholar] [CrossRef]
  37. Chen, C.; Min, F. Memristive bi-neuron Hopfield neural network with coexisting symmetric behaviors. Eur. Phys. J. Plus 2022, 137, 841. [Google Scholar] [CrossRef]
  38. Diethelm, K.; Freed, A.D. The FracPECE subroutine for the numerical solution of differential equations of fractional order. Forsch. Und Wiss. Rechn. 1998, 1999, 57–71. [Google Scholar]
  39. Li, H.; Wang, L.; Lai, Q. Synchronization of a memristor chaotic system and image encryption. Int. J. Bifurc. Chaos 2021, 31, 2150251. [Google Scholar] [CrossRef]
  40. Xian, Y.; Wang, X. Fractal sorting matrix and its application on chaotic image encryption. Inf. Sci. 2021, 547, 1154–1169. [Google Scholar] [CrossRef]
  41. Gao, X.; Mou, J.; Xiong, L.; Sha, Y.; Yan, H.; Cao, Y. A fast and efficient multiple images encryption based on single-channel encryption and chaotic system. Nonlinear Dyn. 2022, 108, 613–636. [Google Scholar] [CrossRef]
  42. Çavuşoğlu, Ü.; Panahi, S.; Akgül, A.; Jafari, S.; Kacar, S. A new chaotic system with hidden attractor and its engineering applications: Analog circuit realization and image encryption. Analog Integr. Circuits Signal Process. 2019, 98, 85–99. [Google Scholar] [CrossRef]
  43. Farhan, A.K.; Al-Saidi, N.M.; Maolood, A.T.; Nazarimehr, F.; Hussain, I. Entropy analysis and image encryption application based on a new chaotic system crossing a cylinder. Entropy 2019, 21, 958. [Google Scholar] [CrossRef]
  44. Lakshmi, C.; Thenmozhi, K.; Rayappan, J.B.B.; Amirtharajan, R. Hopfield attractor-trusted neural network: An attack-resistant image encryption. Neural Comput. Appl. 2020, 32, 11477–11489. [Google Scholar] [CrossRef]
  45. Gupta, M.; Gupta, K.K.; Shukla, P.K. Session key based fast, secure and lightweight image encryption algorithm. Multimed. Tools Appl. 2021, 80, 10391–10416. [Google Scholar] [CrossRef]
  46. Zhou, W.; Wang, X.; Wang, M.; Li, D. A new combination chaotic system and its application in a new Bit-level image encryption scheme. Opt. Lasers Eng. 2022, 149, 106782. [Google Scholar] [CrossRef]
  47. Chai, X.; Gan, Z.; Yuan, K.; Chen, Y.; Liu, X. A novel image encryption scheme based on DNA sequence operations and chaotic systems. Neural Comput. Appl. 2019, 31, 219–237. [Google Scholar] [CrossRef]
  48. Zefreh, E.Z. An image encryption scheme based on a hybrid model of DNA computing, chaotic systems and hash functions. Multimed. Tools Appl. 2020, 79, 24993–25022. [Google Scholar] [CrossRef]
  49. Li, X.; Mou, J.; Xiong, L.; Wang, Z.; Xu, J. Fractional-order double-ring erbium-doped fiber laser chaotic system and its application on image encryption. Opt. Laser Technol. 2021, 140, 107074. [Google Scholar] [CrossRef]
  50. Chen, C.; Zhu, D.; Wang, X.; Zeng, L. One-dimensional quadratic chaotic system and splicing model for image encryption. Electronics 2023, 12, 1325. [Google Scholar] [CrossRef]
  51. Yan, M.; Xie, J. A conservative chaotic system with coexisting chaotic-like attractors and its application in image encryption. J. Control Decis. 2023, 10, 237–249. [Google Scholar] [CrossRef]
  52. Shakir, H.R.; Mehdi, S.A.; Hattab, A.A. A new four-dimensional hyper-chaotic system for image encryption. Int. J. Electr. Comput. Eng. 2023, 13, 1744. [Google Scholar] [CrossRef]
Figure 1. (a) Solutions of Equations (7a) (red color) and (7b) (blue color) with the intersection (0,0). The first equation is solved for  k = 0.8 ,   1.5  and  2  and shown by different red tones. (b) The real part of the eigenvalues  λ 2 , λ 3 . (c) The imaginary part of the eigenvalues  λ 2 , λ 3 . (d) The argument of the eigenvalues  λ 2 , λ 3 .
Figure 2. Bifurcation diagrams of the system as a function of the magnetic coupling strength  k  for different derivative orders: (a)  q = 1 , (b)  q = 0.98 , (c)  q = 0.96 , (d)  q = 0.94 . The orange and blue colors correspond to two initial conditions,  ( 0.1 ,   0 ,   0 )  and  ( −0.1 ,   0 ,   0 ) .
Figure 3. Bifurcation diagram of the model as a function of  q  for  k = 1.2 , 1.4 , 1.6 , 1.8 ,  and  2 .
Figure 4. The attractors of the model for  k = 2.1  and two initial conditions  ( ± 0.1 ,   0 ,   0 ) . (a q = 1 , (b q = 0.98 , (c q = 0.96 , (d q = 0.94 .
Figure 5. Time series corresponding to the attractors shown in Figure 4 where  k = 2.1  and (a q = 1 , (b q = 0.98 , (c q = 0.96 , (d q = 0.94 .
Figure 6. Basin of attraction of two chaotic attractors for (a q = 1  and  k = 1.5 , (b q = 0.98  and  k = 1.8 , (c q = 0.96  and  k = 2.1 , (d q = 0.94  and  k = 2.5 .
Figure 7. Flowchart of the encryption algorithm.
Figure 8. Result of encryption method using the fractional-order system with  k = 2  and  q = 0.98  and the initial condition  ( 0.1 ,   0 ,   0 ) . (a,d) original images, (b,e) the encrypted images, (c,f) the decrypted images.
Figure 9. Result of decryption with the wrong key. The encryption keys are  k = 2  and  q = 0.98  and  ( x 0 , y 0 , z 0 ) = ( 0.1 ,   0 ,   0 ) . (a,d) decrypted images with the correct key, (b,e) the decrypted images with  ( x 0 , y 0 , z 0 ) = ( 0.100001 ,   0 ,   0 ) , (c,f) the decrypted images with  k = 1.99 .
Figure 10. The histogram of the colors of the original and encrypted images. (a) original onion image, (b) encrypted onion image, (c) original cameraman image, (d) encrypted cameraman image.
Figure 11. The correlation of the color depth of two adjacent pixels. (a) original onion image, (b) encrypted onion image, (c) original cameraman image, (d) encrypted cameraman image.
Figure 12. The result of the decryption of cropped images. (a) 1/64 of the image is cropped, (b) 1/16 of the image is cropped, (c) 1/4 of the image is cropped.
Figure 13. The decrypted images from the noisy encrypted images with different intensities. (a,d) noise intensity is 0.05, (b,e) noise intensity is 0.1, (c,f) noise intensity is 0.2.
Figure 14. Peak Signal-to-Noise Ratio of the decrypted image to the original image for different derivative orders. For each q, a range of k with monostable chaotic dynamics is adopted.
Table 1. Results of the NIST SP 800-22 randomness test.
Test                      p-value   Result
Frequency                 0.73      Pass
BlockFrequency            0.91      Pass
CumulativeSums            0.73      Pass
Runs                      0.35      Pass
LongestRun                0.73      Pass
Rank                      0.73      Pass
FFT                       0.21      Pass
NonOverlappingTemplate    0.91      Pass
OverlappingTemplate       0.12      Pass
ApproximateEntropy        0.12      Pass
Serial 1                  0.35      Pass
Serial 2                  0.122     Pass
LinearComplexity          0.53      Pass
Table 2. Comparison of the obtained encryption results with some recent studies.
Method       CC (Horizontal)   CC (Vertical)   Entropy   NPCR    UACI
Proposed     0.0002            −0.0007         7.9971    99.66   28.64
Ref. [44]    −0.0023           0.0028          7.9976    99.62   33.28
Ref. [45]    0.0062            0.0073          7.9965    99.60   28.34
Ref. [46]    0.0005            0.0025          7.9993    99.60   32.48
Ref. [47]    −0.0139           0.0177          7.9993    99.58   33.43
Ref. [48]    −0.0004           −0.0004         7.9993    99.60   33.45
Ref. [49]    0.0058            −0.0024         7.9975    99.60   33.45
Ref. [50]    0.0015            −0.0021         7.9975    99.60   33.45
Ref. [51]    −0.0006           0.0008          7.9995    99.60   33.46
Ref. [52]    0.0002            −0.0004         7.9985    99.63   33.03
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

