Russia is the birthplace of many great mathematicians (counting Königsberg, now Kaliningrad, for Hesse and Hilbert), and the work of quite a few of them has had a significant impact on state-of-the-art computer science and machine learning theory, e.g.:
- Ludwig O. Hesse (1811-1874) - Hessian Matrix
  - Used in second-order optimization (e.g. Newton's method) of Logistic Regression (a binary classifier), and in the analysis of feed-forward Neural Networks; a sketch follows after this list.
- David Hilbert (1862-1943) - Hilbert Space
  - E.g. the feature space induced by Radial Basis Function kernels in Support Vector Machine classifiers is a Hilbert space (see the kernel sketch after this list).
- Andrey N. Kolmogorov (1903-1987) - Kolmogorov Complexity
  - Used in algorithmic information theory, and in the theory behind evolutionary and genetic programming; a compression-based approximation is sketched after the list.
- Andrei A. Markov (1856-1922) - Markov Models and Markov Chains
  - Can be used e.g. for simulation (in games); a small simulation is sketched below.
  - Noteworthy later "spin-offs": Hidden Markov Models (HMM) and Markov Chain Monte Carlo (MCMC).
- Andrei N. Tikhonov (1906-1993) - Tikhonov Regularization
  - Tikhonov Regularization is roughly a template for classification and regression, where the template variable is the loss function: a square loss gives Ridge Regression (also known as Regularized Least Squares Regression or Shrinkage Regression), an epsilon-insensitive loss gives Support Vector Machine Regression, and a hinge loss gives Support Vector Machine Classification. The Ridge case is sketched at the end of this post.
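
For the Hessian item, a minimal sketch assuming numpy and invented toy data: the logistic-regression negative log-likelihood has the closed-form Hessian H = X^T S X with S = diag(p_i(1 - p_i)), which drives one Newton optimization step.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_hessian(X, w):
    """Hessian of the logistic-regression negative log-likelihood:
    H = X^T S X, where S = diag(p_i * (1 - p_i))."""
    p = sigmoid(X @ w)             # predicted probabilities
    s = p * (1.0 - p)              # diagonal of S as a vector
    return X.T @ (X * s[:, None])  # same as X.T @ np.diag(s) @ X

# Invented toy data: one Newton step, w <- w - H^{-1} g
X = np.array([[1.0, 0.5], [1.0, -1.2], [1.0, 2.3], [1.0, 0.1]])
y = np.array([1.0, 0.0, 1.0, 0.0])
w = np.zeros(2)
g = X.T @ (sigmoid(X @ w) - y)     # gradient of the same objective
w -= np.linalg.solve(logistic_hessian(X, w), g)
print(w)
```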
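For the Hilbert-space item, a sketch of the Radial Basis Function kernel: k(x, x') = exp(-gamma * ||x - x'||^2) equals an inner product of feature maps in an infinite-dimensional Hilbert space, evaluated without ever constructing that space. The gamma value and the two points below are arbitrary illustrations.

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=0.5):
    """RBF kernel: the inner product <phi(x1), phi(x2)> in a Hilbert space,
    computed without materializing the infinite-dimensional map phi."""
    return np.exp(-gamma * np.sum((x1 - x2) ** 2))

x, z = np.array([1.0, 2.0]), np.array([1.5, 0.5])
print(rbf_kernel(x, z))  # in (0, 1]; equals 1.0 exactly when x == z
```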
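Kolmogorov Complexity itself is uncomputable, but it can be bounded from above by the size of a compressed encoding. As an illustration of that idea (my own choice of example, not from the list above), here is the classic Normalized Compression Distance approximated with zlib:

```python
import zlib

def c(s: bytes) -> int:
    """Compressed size: a crude upper bound on Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def ncd(a: bytes, b: bytes) -> float:
    """Normalized Compression Distance: small when a and b share structure."""
    return (c(a + b) - min(c(a), c(b))) / max(c(a), c(b))

print(ncd(b"ababab" * 8, b"ab" * 20))        # similar strings -> small value
print(ncd(b"ababab" * 8, b"q7f!x0zk" * 6))   # dissimilar strings -> larger
```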
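A minimal Markov Chain simulation of the kind a game might use; the weather states and transition probabilities are invented for illustration:

```python
import random

# P(next state | current state); the weights in each row sum to 1.
TRANSITIONS = {
    "sunny": (("sunny", 0.8), ("rainy", 0.2)),
    "rainy": (("sunny", 0.4), ("rainy", 0.6)),
}

def simulate(start, steps, seed=0):
    """Walk the chain: the next state depends only on the current one."""
    random.seed(seed)
    state, path = start, [start]
    for _ in range(steps):
        states, weights = zip(*TRANSITIONS[state])
        state = random.choices(states, weights=weights)[0]
        path.append(state)
    return path

print(simulate("sunny", 10))
```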
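Finally, the square-loss instance of Tikhonov Regularization has the closed-form Ridge Regression solution w = (X^T X + lambda * I)^{-1} X^T y; a sketch with invented toy data:

```python
import numpy as np

def ridge(X, y, lam=1.0):
    """Ridge Regression, i.e. Tikhonov-regularized least squares:
    w = (X^T X + lam * I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Invented toy data: y is roughly 2 * x
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.1, 3.9, 6.2, 8.0])
print(ridge(X, y, lam=0.1))  # approaches 2.0 as lam shrinks
```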