# Entropy “2”-Soft Classification of Objects


## Abstract


## 1. Introduction

## 2. Statement of the Problem

#### 2.1. Learning

#### 2.2. Testing

## 3. Model Examples of “2”-Soft Classification

#### 3.1. “2”-Soft Classification of Four-Dimensional Objects

#### 3.1.1. Learning

#### 3.1.2. Testing

#### 3.2. “2”-Soft Classification of Two-Dimensional Objects

#### 3.2.1. Learning

#### 3.2.2. Testing

## 4. Experimental Studies of “2”-Hard/Soft Classifications in the Presence of Data Errors

#### 4.1. Data

#### 4.2. Randomized Model (Decision Rule)

#### 4.3. Testing of the Learning Model: Implementation of “2”-Soft Classification

## 5. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest


**Figure 2.** Two-dimensional section of the probability density function (PDF) ${P}^{*}(\mathbf{a},{\overline{\theta}}^{*})$.

| $i$ | $e_{1}^{(i)}$ | $e_{2}^{(i)}$ | $e_{3}^{(i)}$ | $e_{4}^{(i)}$ |
|---|---|---|---|---|
| 1 | 0.11 | 0.75 | 0.08 | 0.21 |
| 2 | 0.91 | 0.65 | 0.11 | 0.81 |
| 3 | 0.57 | 0.17 | 0.31 | 0.91 |
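In a “2”-soft classification, each object is assigned a probability distribution over the two classes rather than a single hard label. The sketch below is a minimal illustration only, assuming NumPy: it stores the three four-dimensional learning-set objects from the table above and evaluates a *hypothetical* decision rule (a logistic function of the feature mean, a stand-in invented here for demonstration, not the paper's randomized entropy model), reporting the class-1 probability and the Shannon entropy of each resulting two-class distribution.

```python
import numpy as np

# Feature vectors e^(i) of the three learning-set objects from the table above
# (rows = objects i = 1..3, columns = features e_1..e_4).
E = np.array([
    [0.11, 0.75, 0.08, 0.21],
    [0.91, 0.65, 0.11, 0.81],
    [0.57, 0.17, 0.31, 0.91],
])

def two_soft_entropy(p):
    """Shannon entropy (in nats) of the 2-class membership distribution (p, 1 - p)."""
    return float(sum(-x * np.log(x) for x in (p, 1.0 - p) if x > 0.0))

def decision_rule(e):
    """HYPOTHETICAL decision rule for illustration only: a logistic function
    of the mean feature value, mapping a feature vector to P(class 1)."""
    return float(1.0 / (1.0 + np.exp(-(e.mean() - 0.5))))

for i, e in enumerate(E, start=1):
    p = decision_rule(e)
    print(f"object {i}: P(class 1) = {p:.3f}, entropy = {two_soft_entropy(p):.3f} nats")
```

An object whose membership probability is near 0.5 has entropy near $\ln 2 \approx 0.693$ nats (maximal uncertainty), while a confidently classified object has entropy near zero.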

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).


Popkov, Y.S.; Volkovich, Z.; Dubnov, Y.A.; Avros, R.; Ravve, E.
Entropy “2”-Soft Classification of Objects. *Entropy* **2017**, *19*, 178.
https://doi.org/10.3390/e19040178
