# The Effect of Learning Rate on Fractal Image Coding Using Artificial Neural Networks

## Abstract


## 1. Introduction

## 2. Backpropagation and IFS

## 3. Properties of IFSs

#### IFS Coding

## 4. Using Neural Networks to Code IFS

## 5. Activation Function

- Logistic function $\varphi \left(v\right)=\frac{1}{1+{e}^{-av}}$, whose range is $\left(0,1\right)$.
- Hyperbolic tangent $\varphi \left(v\right)=\frac{1-{e}^{-av}}{1+{e}^{-av}}$ and the algebraic sigmoid function $\varphi \left(v\right)=\frac{v}{\sqrt{1+{v}^{2}}}$, whose range is $\left(-1,1\right)$.
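The three activation functions above can be sketched directly; the slope parameter $a$ defaults to 1 here as an illustrative choice.

```python
import numpy as np

def logistic(v, a=1.0):
    """Logistic sigmoid; outputs lie in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a * v))

def tanh_sigmoid(v, a=1.0):
    """Hyperbolic-tangent form; outputs lie in (-1, 1)."""
    return (1.0 - np.exp(-a * v)) / (1.0 + np.exp(-a * v))

def algebraic_sigmoid(v):
    """Algebraic sigmoid; outputs lie in (-1, 1)."""
    return v / np.sqrt(1.0 + v ** 2)
```

All three are smooth and monotone, which is what backpropagation requires of an activation function; they differ only in output range and saturation speed.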

#### 5.1. Learning Rate with Positive Coefficients of IFS

where $b_i$ is the bias of the $i$-th input neuron. An image is obtained at the end of this iterated operation by applying many random iterations to many points. This image differs from the target image of the system (IFS); thus, the neural network weights must be updated to obtain an improved approximation of the target image.
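The random-iteration procedure described above can be sketched as a chaos game; the specific affine maps (a Sierpinski-style IFS), the point count, and the transient cutoff below are illustrative assumptions, not the paper's exact settings.

```python
import random

# Illustrative IFS: three affine contractions, each sending
# (x, y) to (a*x + b*y + e, c*x + d*y + f).
MAPS = [
    (0.5, 0.0, 0.0, 0.5, 0.0,  0.0),
    (0.5, 0.0, 0.0, 0.5, 0.5,  0.0),
    (0.5, 0.0, 0.0, 0.5, 0.25, 0.5),
]

def chaos_game(maps, n_points=10000, seed=0):
    """Render an IFS attractor by iterating randomly chosen maps."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    points = []
    for i in range(n_points):
        a, b, c, d, e, f = rng.choice(maps)
        x, y = a * x + b * y + e, c * x + d * y + f
        if i > 20:  # discard the transient before the orbit settles onto the attractor
            points.append((x, y))
    return points

pts = chaos_game(MAPS)
```

Comparing the rendered point cloud with the target image yields the error signal that drives the weight updates mentioned above.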

#### 5.2. Learning Rate with Positive and Negative Coefficients of IFS

The coefficients of the IFS carry three indices: $i = 1, 2, \ldots, n$, where $n$ is the number of IFS maps; $j = 1, 2$; and $k = 1, 2, 3$.
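This indexing can be read as an $(n, 2, 3)$ array: for map $i$, row $j$ holds two linear coefficients ($k = 1, 2$) and one translation term ($k = 3$). The concrete coefficient values below are an assumed Sierpinski-style example, not taken from the paper.

```python
import numpy as np

# Coefficients as an (n, 2, 3) array: n = 3 maps, 2 output coordinates,
# 2 linear coefficients plus 1 translation term per coordinate.
coeffs = np.array([
    [[0.5, 0.0, 0.0],  [0.0, 0.5, 0.0]],
    [[0.5, 0.0, 0.5],  [0.0, 0.5, 0.0]],
    [[0.5, 0.0, 0.25], [0.0, 0.5, 0.5]],
])

def apply_map(coeffs, i, point):
    """Apply the i-th affine map w_i(p) = A_i p + t_i to a 2-D point."""
    A = coeffs[i, :, :2]  # 2x2 linear part (k = 1, 2)
    t = coeffs[i, :, 2]   # translation (k = 3)
    return A @ point + t

p = apply_map(coeffs, 1, np.array([1.0, 1.0]))
```

In the neural-network formulation, these coefficients play the role of the weights and biases being learned.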

#### 5.3. Learning Rate with Coefficients of IFS $b_{ij} > 1$

The same activation function is used for the first kind when $b_{ij}$ is positive, and, with a small change, for the second kind when $b_{ij}$ takes both positive and negative values (Figure 3 and Figure 7).
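When a coefficient can exceed 1, an activation bounded in $(0,1)$ cannot represent it. One plausible adjustment (an assumption for illustration, not necessarily the paper's exact modification) is to scale the logistic output by a constant $c > 1$ so the range becomes $(0, c)$:

```python
import numpy as np

def scaled_logistic(v, a=1.0, c=2.0):
    """Logistic sigmoid scaled by c, so outputs lie in (0, c).

    With c > 1 the unit can represent IFS coefficients b_ij > 1;
    the value c = 2.0 is an illustrative assumption.
    """
    return c / (1.0 + np.exp(-a * v))
```

The scaling changes only the output range; the function remains smooth and monotone, so backpropagation applies unchanged.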

## 6. Future Work

## Funding

## Acknowledgments

## Conflicts of Interest


**Figure 4.** Some final fractal images of the first kind for the neural network with different learning rates.

**Figure 5.** Some final fractal images of the second kind for the neural network with different learning rates.

**Figure 6.** Some final fractal images of the third kind for the neural network with different learning rates.


© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Al-Jawfi, R.A.
The Effect of Learning Rate on Fractal Image Coding Using Artificial Neural Networks. *Fractal Fract.* **2022**, *6*, 280.
https://doi.org/10.3390/fractalfract6050280
