Proceeding Paper

Investigation of Memristor-Based Neural Networks on Pattern Recognition †

by Gayatri Routhu 1, Ngangbam Phalguni Singh 1,*, Selvakumar Raja 2 and Eppala Shashi Kumar Reddy 1

1 Department of Electronics and Communication Engineering, Koneru Lakshmaiah Education Foundation, KL Deemed to be University, Vijayawada 522302, India
2 Department of Electronics and Communication Engineering, Anna University, Chennai 600025, India
* Author to whom correspondence should be addressed.
Presented at the International Conference on “Holography Meets Advanced Manufacturing”, Online, 20–22 February 2023.
Eng. Proc. 2023, 34(1), 9; https://doi.org/10.3390/HMAM2-14149
Published: 13 March 2023

Abstract: Mobile phones, laptops, computers, digital watches, and digital calculators are among the most used products in our daily lives. Behind these gadgets, many simple components are needed for the electronics to function; the resistor, capacitor, and inductor are the three basic circuit elements, and the memristor is one such component. This paper provides simulation results of the memristor circuit and its V-I characteristics for different functions applied as the input signal. A well-trained ANN can recognize images with high precision. To enhance properties such as accuracy, precision, and efficiency in recognition, memristor characteristics are introduced into the neural network; however, older devices suffer from non-linearity issues that cause conductance-tuning problems. At the same time, for advanced applications, an ANN requires a huge amount of vector-matrix multiplication as the network is expanded in depth. An ionic floating gate (IFG) device with the characteristics of a memristive device can solve these problems. This work proposes a fully connected ANN using the IFG model, and the simulation results of the IFG model serve as synapses in deep learning. We use algorithms such as the gradient descent model and forward and backward propagation for network building and weight setting, enhancing the network's ability to recognize images. A well-trained network is formed by tuning the memristive devices to an optimized state. The synaptic memory obtained from the IFG device can be reused in other deep neural networks to increase recognition accuracy. The sigmoid function was initially used as the activation function in the neural network but was later replaced by the ReLU function to avoid vanishing gradients. This paper shows how images were recognized by their front, top, and side views.

1. Introduction

A memristor is a simple passive two-terminal structure first postulated by Professor Leon Chua in 1971. Its name is a contraction of "memory resistor"; it is a fourth fundamental circuit element besides the resistor, capacitor, and inductor, supplying the missing link between charge and flux among the four basic circuit variables [1]. Depending on the input signal applied, there are two types of memristors: current-controlled and voltage-controlled [2]. The basic memristor is shown in Figure 1.
Using Equations (1) and (2), we can find the value of the memristance or memductance. These properties make the memristor useful for non-volatile memory and storage applications, since it can theoretically hold multiple states and can operate at a very low voltage level; the same properties make it suitable as a synapse in a neural network. Equations (1) and (2) below show the relationship between flux and charge with respect to voltage and current [1].
M(q) = dΨ/dq    (1)

M(q(t)) = (dΨ/dt) · (dt/dq) = V(t)/I(t)    (2)
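As a quick numerical illustration of Equations (1) and (2) (a minimal sketch with assumed waveforms, not measured device data), the memristance computed as dΨ/dq matches V(t)/I(t):

```python
import numpy as np

# Minimal numeric check of Eqs. (1)-(2): M = dPsi/dq = V(t)/I(t).
# The voltage and current waveforms below are illustrative assumptions.
t = np.linspace(1e-3, 1.0, 1000)
V = np.sin(2 * np.pi * t)                      # applied voltage (V)
I = V / (1.0 + 0.5 * np.cos(2 * np.pi * t))    # toy state-dependent current (A)
dt = t[1] - t[0]
psi = np.cumsum(V) * dt                        # flux: integral of V dt
q = np.cumsum(I) * dt                          # charge: integral of I dt
M_chain = np.gradient(psi, t) / np.gradient(q, t)  # dPsi/dq via chain rule
M_direct = V / I                               # Eq. (2): V(t)/I(t)
print(M_chain[250], M_direct[250])             # agree away from zero crossings
```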
ANNs are interconnected groups of nodes composed of artificial neurons. An artificial neural network is a non-linear, self-adapting computational model inspired by the biological neural networks of human and animal brains [3]. An ANN has three layers: an input layer, a hidden layer, and an output layer. When every node of one layer connects through links to every node of the following layer, the network is called fully connected [4]. It receives inputs, combines them, performs the required computations based on a predefined activation function, and delivers the output. Two-terminal ANN devices suffer from non-linearity and asymmetric conductance-tuning problems, which also limit their scalability. To solve these issues, an IFG (ionic floating gate) memristor-based device is developed, resulting in a memristor-based neural network (MNN), i.e., an ANN with memristor features. To implement this, a Cadence Verilog-A model is used along with deep learning concepts. A memristor device has a simple two-terminal structure, and its conductance can be modified by simple positive and negative pulses while representing a synaptic weight [5]. These functions make the memristor suitable for realizing synaptic weights in an artificial neural network. By using the memristor device output as a synapse, a memristive neural network can be expanded into a multi-layer network with modified memristor-based backpropagation and gradient-descent learning rules [6].
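As a concrete sketch of such a fully connected network (the 784-128-10 layer sizes match 28 × 28 MNIST inputs and ten digit classes, and the random initial weights are assumptions; in the MNN, the weights are realized by memristor conductances):

```python
import numpy as np

def relu(x):
    # ReLU activation, used in place of the sigmoid to avoid vanishing gradients
    return np.maximum(0.0, x)

# Sketch of a fully connected three-layer ANN. Layer sizes (784-128-10) and
# random initial weights are assumptions for illustration; in the MNN the
# weights come from memristor conductances.
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, (784, 128))  # input -> hidden synaptic weights
W2 = rng.normal(0.0, 0.1, (128, 10))   # hidden -> output synaptic weights

def forward(x):
    h = relu(x @ W1)     # hidden layer: weighted sum, then activation
    return h @ W2        # output layer: one score per class

x = rng.random(784)      # a flattened 28x28 input pattern
print(forward(x).shape)  # -> (10,)
```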

2. Memristor-Based Neural Networks

In designing the MNN, both network building and weight loading play a crucial role; thereafter, the IFG model acts as the conductance element. To verify pattern recognition with the MNN, the MNIST dataset is applied at the input neurons. Additionally, we use resistors to build a fully connected MNN. The optimization of weights is done in software [7]. Equation (3) below gives the output voltage.
Vout = Rlimit · Σ_{i=1}^{n} V_Ri · (1/R_ij)    (3)
Here, R_ij is the reciprocal of the conductance (G) obtained from Cadence Virtuoso; these values are further used as the weights W_ij. To train and test the neural network, a massive dataset is needed. Fortunately, the MNIST (Modified National Institute of Standards and Technology) database exists, which contains 60,000 training images and 10,000 testing images. In this dataset, the pixel greyscale values, which range from 0 to 255, are normalized [8].
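A minimal sketch of this mapping follows (the conductance values below are placeholders for the Cadence Virtuoso results, and the max-scaling of the weights is an assumed normalization):

```python
import numpy as np

# Sketch: conductances G = 1/R_ij become synaptic weights W_ij, and MNIST
# greyscale pixels are normalized from [0, 255] to [0, 1]. The G values are
# placeholders for Cadence Virtuoso results; max-scaling is an assumption.
G = np.array([1.0e-7, 9.9e-8, 2.0e-8, 6.0e-6])  # example conductances (S)
W = G / np.abs(G).max()                          # dimensionless weights W_ij

pixels = np.array([0, 64, 128, 255], dtype=float)
x = pixels / 255.0                               # normalized inputs in [0, 1]
print(W, x)
```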

2.1. Memristor-Based IFG Model

The ionic floating gate (IFG) memristive device has three terminals and combines a redox transistor with a non-volatile conductive bridge memory (CBM) to form a non-volatile synaptic memory [9]. The redox transistor was developed to overcome limitations of memristor techniques such as low write efficiency, vanishing gradients, and limited accuracy. It contains three layers: the first is a PEDOT:PSS (poly(3,4-ethylenedioxythiophene):polystyrene sulfonate) film, followed by a Nafion layer; the last layer is a PEDOT:PSS film partially reduced by poly(ethylenimine) (PEI) [10]. Connected to this is a conductive bridge memory designed with a layer of Ag between Pt electrodes. The IFG model can be used as a memory storage device that remembers the last operation. The memory operates in two modes: read and write. Figures 2 and 3 show the internal view of the IFG memory [11].
Taking the above two models as references, with some modifications, a compact IFG model is characterized in Verilog-A code; importing this Verilog-A file into Cadence generates the IFG model [12], shown in Figure 4. Here, we move the selector so that it sits between the electrolyte and the gate terminal. These choices suit the design of memristor-based synaptic circuits, and the IFG model's conductance values are taken as the synaptic arrays. For the recognition of patterns/images, we use a fully connected three-layer artificial network [13]. All weight connections hold a fixed conductance between the layers of the neural network, stored as the IFG's conductance "G" and mapped onto the source-drain conductance of the synaptic array [14]. The change in conductance is proportional to the applied flux when the gate voltage exceeds the threshold voltage, which tunes the gate-to-source voltage [15]. The schematic circuit of the IFG, implemented in Cadence, is shown in Figure 4.
The voltage applied at the gate node must be greater than the threshold voltage (Vth); a level of ±0.95 V is applied between the gate and source nodes.
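The write behavior described above can be sketched behaviorally as follows (the flux-to-conductance coefficient K is an assumed fitting constant; the sign convention follows the pulse response reported in Section 3, where positive pulses decrease conductance):

```python
# Behavioral sketch of the IFG write rule: conductance changes only while
# |Vg| exceeds Vth, with dG proportional to the applied flux (Vg * dt).
# K is an assumed fitting constant; the sign matches the reported pulse
# response (positive pulses decrease G, negative pulses increase it).
VTH = 0.4    # threshold voltage (V), from Section 3
K = 1e-7     # assumed flux-to-conductance coefficient (S per V·s)

def update_conductance(G, Vg, dt):
    if abs(Vg) > VTH:
        G -= K * Vg * dt   # +0.95 V pulses depress, -0.95 V pulses potentiate
    return G

G = 1.0e-7
for _ in range(10):                        # ten -0.95 V write pulses, 1 ms each
    G = update_conductance(G, -0.95, 1e-3)
print(G)                                   # conductance has increased
```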

2.2. Weight Setting through Gradient Descent Model and Backpropagation

Specific learning algorithms help to speed up the training of neural networks. Here, we adopt the gradient descent model and backpropagation along with the memristor-based IFG circuit; running the ANN takes no more than a few seconds [16]. A text-based Spectre netlist for the connections in the ANN circuit is generated with the help of Python, whose input is the database from the Cadence simulation of the synapse [17]. First, normally weighted synapses taken from the output of the IFG model are given to the neural network; after the network produces optimized values, these are applied back to the IFG to observe whether the activation function is optimized. The weights are optimized using the gradient descent and backpropagation models, and the differences are backpropagated through the network. At epoch 0.68, the accuracy increases by 0.8% compared with the traditional method. The circuit-level design and implementation of the gradient-descent and backpropagation learning architectures help to obtain optimized results, which are then compared with the original ones. Figure 5 below shows the simple operation of an ANN.
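A minimal sketch of one such training step is given below (the network shapes, learning rate, and squared-error loss are illustrative assumptions; in the actual flow, the initial weights would come from the IFG conductances and the optimized weights would be written back to the devices):

```python
import numpy as np

# Sketch of weight setting by backpropagation and gradient descent on a
# 784-128-10 network. Shapes, learning rate, and squared-error loss are
# assumptions; initial weights would come from the IFG conductances.
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, (784, 128))
W2 = rng.normal(0.0, 0.1, (128, 10))
LR = 0.01                                # gradient-descent learning rate

def train_step(x, target):
    global W1, W2
    h = np.maximum(0.0, x @ W1)          # forward pass, ReLU hidden layer
    y = h @ W2
    err = y - target                     # output error (squared-error loss)
    dW2 = np.outer(h, err)               # gradient w.r.t. W2
    dh = (W2 @ err) * (h > 0)            # backpropagate through the ReLU
    dW1 = np.outer(x, dh)                # gradient w.r.t. W1
    W1 -= LR * dW1                       # gradient-descent updates
    W2 -= LR * dW2
    return 0.5 * float(err @ err)        # loss value for monitoring

x = rng.random(784)                      # one normalized input pattern
target = np.eye(10)[3]                   # one-hot label (class 3)
print(train_step(x, target))
```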
To achieve image pattern recognition, the optimized values are again given as weighted synapses in the neural network, along with the images as input. Here, the ReLU activation function is used together with the Adam optimizer. W_ij are the synaptic weights, and h_ij is the weighted sum of the input neurons; this summation is fed to the activation function, as shown in Figure 6.

3. Results and Analysis

To examine the basic memristor characteristics, observe Figure 7. A sine function V(t) = V0 sin(ωt), where ω is the angular frequency, is applied as the input with an amplitude of 1 V and a frequency of 1 Hz. The V-I characteristics show that the current (I) increases with applied negative voltage and drops with positive voltage, causing a pinched hysteresis loop to emerge. We can also see the linear relationship between the charge and the flux.
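The MATLAB model itself is not given here; as a sketch, the pinched hysteresis can be reproduced with the standard linear ion-drift memristor model under the same 1 V, 1 Hz sine drive (all device parameters below are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Sketch: linear ion-drift memristor model under a 1 V, 1 Hz sine drive.
# RON, ROFF, D, and MU are illustrative HP-style parameters, not values
# from the paper; plotting I against V gives the pinched hysteresis loop.
RON, ROFF = 100.0, 16e3        # on/off resistances (ohm)
D, MU = 10e-9, 1e-14           # device thickness (m), ion mobility (m^2/V·s)
t = np.linspace(0.0, 2.0, 20000)
dt = t[1] - t[0]
V = 1.0 * np.sin(2 * np.pi * 1.0 * t)   # 1 V amplitude, 1 Hz input
x = 0.5                                  # normalized internal state
I = np.zeros_like(t)
for k in range(t.size):
    M = RON * x + ROFF * (1.0 - x)       # instantaneous memristance
    I[k] = V[k] / M
    x += MU * RON / D**2 * I[k] * dt     # linear drift of the state
    x = min(max(x, 0.0), 1.0)            # clamp to physical bounds
# flux and charge, whose plot is approximately linear as in Figure 7c
psi, q = np.cumsum(V) * dt, np.cumsum(I) * dt
```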
In the results shown in Figure 8, the conductance values decrease for a positive input voltage and increase for a negative input voltage, similar to the characteristics of a memristor. These are the simulation results of the IFG model in Cadence, for which the applied voltage is ±0.95 V and the threshold voltage is 0.4 V. The input voltage is tuned between the gate and source, while the threshold voltage (Vth) is applied by a DC source connected between the drain and source.
Figure 9 below shows the post-simulation results for the optimized values from the IFG model.
After applying the optimized values to the IFG device, the same operation is repeated as in the previous case, but with the accuracy increased by 0.8%. Here, the optimized parameters are X = 0.150 V and Y = −0.612 V.
Table 1 lists values for the basic memristor, showing how the current varies with the input voltage, while Tables 2 and 3 list values for the IFG model obtained with Cadence Virtuoso, showing how the conductance varies with the input voltage. The parameters stated for the basic memristor are the current (I), voltage (V), flux (Ψ), and charge (q). For the IFG model, the conductance values (Gmax) are given in siemens as a function of voltage.
Table 4 compares the traditional model with the proposed model. The traditional model uses the sigmoid activation function, whereas the proposed model uses the ReLU activation function with the Adam optimizer; with the optimized parameters, the recognition accuracy increases to 94.6%. Thus, the proposed IFG model improves the accuracy.

4. Conclusions

Artificial neural networks play a vital role in unsupervised deep learning models for pattern-recognition applications and in many areas of our daily lives. This paper describes the development, in Cadence, of an IFG device with the characteristics of a memristor-based circuit. To demonstrate the capability of the IFG device in pattern recognition, the values are optimized using the gradient descent model, and the resulting optimized parameters are compared with the existing ones. The testing accuracy of the neural network increased by 0.8% when using the IFG model: the accuracy of the original network is 93.8%, which rises to 94.6%. The ReLU activation function and the Adam optimizer help achieve faster optimization, and from these results we conclude that using IFG-based memristor characteristics in neural networks can increase image recognition accuracy.

Author Contributions

G.R.: Running experiments and drafting paper. N.P.S.: Supervision and proofreading. S.R.: Running experiments and analysis. E.S.K.R.: Running experiments and drafting paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data can be shared on request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Creswell. UC Santa Cruz Electronic Theses and Dissertations. 2020. Available online: https://escholarship.org/uc/item/0jx2107r (accessed on 1 June 2020).
  2. Amelia, D. Application of Memristive Device Arrays for Pattern Recognition. 2020. Available online: http://mpoc.org.my/malaysian-palm-oil-industry/ (accessed on 1 June 2020).
  3. Adhikari, S.P.; Kim, H.; Budhathoki, R.K.; Yang, C.; Chua, L.O. A circuit-based learning architecture for multilayer neural networks with memristor bridge synapses. IEEE Trans. Circuits Syst. I Regul. Pap. 2015, 62, 215–223.
  4. Krestinskaya, O.; Salama, K.N.; James, A.P. Learning in memristive neural network architectures using analog backpropagation circuits. IEEE Trans. Circuits Syst. I Regul. Pap. 2019, 66, 719–732.
  5. Krestinskaya, O.; Ibrayev, T.; James, A.P. Hierarchical Temporal Memory Features with Memristor Logic Circuits for Pattern Recognition. IEEE Trans. Comput. Des. Integr. Circuits Syst. 2018, 37, 1143–1156.
  6. Chu, M.; Kim, B.; Park, S.; Hwang, H.; Jeon, M.; Lee, B.H.; Lee, B.G. Neuromorphic Hardware System for Visual Pattern Recognition with Memristor Array and CMOS Neuron. IEEE Trans. Ind. Electron. 2015, 62, 2410–2419.
  7. Zhang, Y.; Li, Y.; Wang, X.; Friedman, E.G. Synaptic Characteristics of Ag/AgInSbTe/Ta-Based Memristor for Pattern Recognition Applications. IEEE Trans. Electron Devices 2017, 64, 1806–1811.
  8. Yan, R.; Hong, Q.; Wang, C.; Sun, J.; Li, Y. Multilayer Memristive Neural Network Circuit Based on Online Learning for License Plate Detection. IEEE Trans. Comput. Des. Integr. Circuits Syst. 2022, 41, 3000–3011.
  9. Abdoli, B.; Amirsoleimani, A.; Shamsi, J.; Mohammadi, K.; Ahmadi, A. A Novel CMOS-Memristor Based Inverter Circuit Design. In Proceedings of the 2014 22nd Iranian Conference on Electrical Engineering (ICEE), Tehran, Iran, 20–22 May 2014.
  10. Azghadi, M.R.; Linares-Barranco, B.; Abbott, D.; Leong, P.H.W. A Hybrid CMOS-Memristor Neuromorphic Synapse. IEEE Trans. Biomed. Circuits Syst. 2017, 11, 434–445.
  11. Querlioz, D.; Bichler, O.; Dollfus, P.; Gamrat, C. Immunity to device variations in a spiking neural network with memristive nanodevices. IEEE Trans. Nanotechnol. 2013, 12, 288–295.
  12. Ran, H.; Wen, S.; Li, Q.; Yang, Y.; Shi, K.; Feng, Y.; Zhou, P.; Huang, T. Memristor-Based Edge Computing of Blaze Block for Image Recognition. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 2121–2131.
  13. Xu, X.; Xu, W.; Wei, B.; Hu, F. Memristor-based neural network circuit of delay and simultaneous conditioning. IEEE Access 2021, 9, 148933–148947.
  14. Zhang, Y.; Wang, X.; Li, Y.; Friedman, E.G. Memristive Model for Synaptic Circuits. IEEE Trans. Circuits Syst. II Express Briefs 2017, 64, 767–771.
  15. Duan, Q.; Jing, Z.; Zou, X.; Wang, Y.; Yang, K.; Zhang, T.; Wu, S.; Huang, R.; Yang, Y. Spiking neurons with spatiotemporal dynamics and gain modulation for monolithically integrated memristive neural networks. Nat. Commun. 2020, 11, 3399.
  16. Sacchetto, D.; Gaillardon, P.E.; Zervas, M.; Carrara, S.; De Micheli, G.; Leblebici, Y. Applications of multi-terminal memristive devices: A review. IEEE Circuits Syst. Mag. 2013, 13, 23–41.
  17. Truong, S.N. Single Crossbar Array of Memristors with Bipolar Inputs for Neuromorphic Image Recognition. IEEE Access 2020, 8, 69327–69332.
Figure 1. (a) The relationships between the circuit variables: voltage and current (resistor), voltage and charge (capacitor), and current and flux (inductor); (b) the basic memristor symbol.
Figure 2. A polymer-based redox transistor.
Figure 3. Complete IFG model showing connections between redox transistor and CBM. (a) Write operation; (b) read operation.
Figure 4. IFG-model-equivalent circuit. In the IFG model, the voltage applied at the “Gate” terminal is “Vw”, which is the write voltage, and at the “Drain” terminal is “Vr” which is the read voltage.
Figure 5. A simple circuit configuration from the input layer to the output layer.
Figure 6. Operation of neural networks with IFG memory synapse and implementing image processing using gradient descent algorithm and backpropagation.
Figure 7. MATLAB simulation results of the basic memristor. (a) Input sine-wave signal; (b) V-I characteristics of the memristor; (c) linear relation between charge and flux; (d) pinched hysteresis loop (non-linear characteristics).
Figure 8. Two sets of graphs, one with the positive pulse response, and another one with the negative pulse response.
Figure 9. Post-simulation results for optimized parameters.
Table 1. Parametric values of the current, voltage, flux and charge for the hysteresis curve, with reference to Figure 1.

Measurement # | Input Voltage | Input Current | Flux (Wb)  | Charge (C)
1             | 0.018 mV      | 0.022 mA      | 3.3 × 10⁻⁹ | −2.8 × 10⁻⁹
2             | 0.037 mV      | 0.044 mA      | 1.3 × 10⁻⁴ | −1.1 × 10⁻⁸
3             | 0.056 mV      | 0.066 mA      | 1.5 × 10⁻⁴ | −2.5 × 10⁻⁸
4             | 0.075 mV      | 0.088 mA      | 1.8 × 10⁻⁴ | −4.5 × 10⁻⁸
Table 2. Parametric value analysis when the input pulse response = +0.95 V.

Measurement # | Input Voltage (V) | Conductance X (S), Gmax = 1 × 10⁻⁷ | Conductance Y (S), Gmax = 1 × 10⁻⁷
1             | 0.95              | 0                                   | 1.0 × 10⁻⁷
2             | 0.60              | 2.0 × 10⁻⁸                          | 1.0 × 10⁻⁷
3             | 0.31              | 3.0 × 10⁻⁷                          | 1.0 × 10⁻⁷
4             | 0.19              | 4.02 × 10⁻⁶                         | 9.9 × 10⁻⁸
5             | 0                 | 6.02 × 10⁻⁶                         | 9.9 × 10⁻⁸
Table 3. Parametric value analysis when the input pulse response = −0.95 V.

Measurement # | Input Voltage (V) | Conductance X (S), Gmax = 1 × 10⁻⁷ | Conductance Y (S), Gmax = 1 × 10⁻⁷
1             | 0                 | 0                                   | 1.0 × 10⁻⁷
2             | −0.19             | 2.0 × 10⁻⁸                          | 1.0 × 10⁻⁷
3             | −0.31             | 3.0 × 10⁻⁷                          | 1.0 × 10⁻⁷
4             | −0.60             | 4.0 × 10⁻⁶                          | 1.0 × 10⁻⁷
5             | −0.95             | 6.0 × 10⁻⁶                          | 1.0 × 10⁻⁷
Table 4. Comparison between the analytical values and the optimized values.

Model             | Activation Function          | Accuracy
Traditional model | Sigmoid                      | 93.8%
Proposed model    | ReLU (with Adam optimizer)   | 94.6%
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
