### EderSantana/awesomeMLmath

Curated list to learn the math basics for machine learning

• Users starred: 213
• Users forked: 41
• Users watching: 213
• Updated at: 2020-05-27 07:06:40

# awesomeMLmath

Curated list to learn the math basics for machine learning. Note that this is a biased list from a Deep Learning researcher.

The main topics are Calculus, Linear Algebra, Statistics, Probability, and Signal Processing. If you "combine" Probability with Signal Processing you get Stochastic Processes, which is the theory behind RNNs, Kalman filters, etc.

### Linear Algebra

Linear Algebra MIT
Note that it is good to learn about eigenvalue decomposition, etc (DO NOT stop at just solving systems of equations). You will use that to understand Principal Component Analysis and the idea of space transformations, which is what feature learning is all about.
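As a taste of why eigendecomposition matters beyond solving systems of equations, here is a minimal PCA sketch in NumPy (the data and sizes are made up for illustration): diagonalizing the covariance matrix gives you the directions of maximal variance, and projecting onto them is exactly the kind of space transformation mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy correlated 2-D data: the second feature is mostly a copy of the first.
x = rng.normal(size=(500, 2))
x[:, 1] = 0.9 * x[:, 0] + 0.1 * x[:, 1]

x = x - x.mean(axis=0)                  # center the data
cov = (x.T @ x) / (len(x) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigendecomposition (ascending order)

# Sort components by explained variance, largest first.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the top principal component: a space transformation.
projected = x @ eigvecs[:, :1]
print("explained variance ratio:", eigvals / eigvals.sum())
```

Because the two features are strongly correlated, almost all the variance lands on the first principal component.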

### Statistics and probability

edx Introduction to Statistics
edx Probability
An Exploration of Random Processes for Engineers. This is an advanced course, but one of my favorites.
Information Theory
Here is the deal: a probability density function (pdf) is as much as we can know about a random variable. Machine Learning is about estimating "moments" (you should learn what those are) of a pdf. If your random variable is not Gaussian, you will need more than the mean and variance to describe it correctly (the mean and variance are the 1st- and 2nd-order moments). Information Theory generalizes all of that.
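A quick illustration of that point, using NumPy with made-up sample sizes: a Gaussian and a Laplace distribution can share the same mean and variance, yet their higher-order moments differ, so the first two moments alone cannot tell them apart.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
gaussian = rng.normal(loc=0.0, scale=1.0, size=n)
# A Laplace with scale 1/sqrt(2) also has mean 0 and variance 1.
laplace = rng.laplace(loc=0.0, scale=1 / np.sqrt(2), size=n)

def moment(x, k):
    """Estimate the k-th central moment, E[(x - E[x])^k], from samples."""
    return np.mean((x - x.mean()) ** k)

for name, x in [("gaussian", gaussian), ("laplace", laplace)]:
    print("%s: mean=%.2f var=%.2f 4th moment=%.2f"
          % (name, x.mean(), moment(x, 2), moment(x, 4)))
```

The 4th central moment comes out near 3 for the Gaussian but near 6 for the Laplace, even though the first two moments match.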

### Signal Processing

Signal processing will teach you what convolutions are (for your convolutional neural nets). But no worries, signal processing is just linear algebra++. For example, the Fourier transform is an eigenvalue decomposition. All is one. If you truly learned linear algebra, this part is mostly free.
edx Signals and Systems, part 1
edx Discrete Time Signal Processing
adaptive signal processing: this is the basis of neural nets, but I couldn't find modern video material...
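The "signal processing is linear algebra++" claim above can be checked in a few lines of NumPy (signal length and values are arbitrary): the DFT turns circular convolution into pointwise multiplication, because complex exponentials are eigenvectors of convolution operators.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=64)   # a signal
h = rng.normal(size=64)   # a filter

# Direct circular convolution, O(n^2).
direct = np.array([sum(x[k] * h[(n - k) % 64] for k in range(64))
                   for n in range(64)])

# The same convolution in the Fourier domain, O(n log n):
# transform, multiply pointwise, transform back.
via_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

print("max error:", np.abs(direct - via_fft).max())
```

The two results agree to floating-point precision, which is the convolution theorem in action.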

### How to study, the "degrees"

To earn a practitioner badge, you should feel good if you can use Keras and XGBoost for Kaggle competitions.

The developer level asks for the ability to write your own multilayer perceptron from scratch and to contribute new layers to Keras. This means implementing models from papers in Theano or TensorFlow. You will need calculus and linear algebra to understand what you are doing, and maybe some statistics and probability to understand the expectation operators and cost functions.
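A "multilayer perceptron from scratch" can be smaller than it sounds. Here is one possible sketch in NumPy, trained on XOR with hand-derived backpropagation; the layer sizes, learning rate, and iteration count are illustrative choices, not recommendations.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of tanh units, one sigmoid output unit.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass; for binary cross-entropy, dL/dlogits = p - y.
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dz1 = dh * (1 - h ** 2)          # derivative of tanh
    dW1 = X.T @ dz1; db1 = dz1.sum(axis=0)
    # Gradient descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("predictions:", p.round(2).ravel())
```

Writing the backward pass by hand like this is exactly where the calculus (chain rule) and linear algebra (matrix shapes) mentioned above get used.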

To earn your researcher degree, you should be exploring new fields and writing new types of models. I can't tell you what to study to do that. Maybe nothing, maybe everything; if I knew, it wouldn't be called research. But if you can understand "An Exploration of Random Processes for Engineers" you may already know most of the basic and intermediate math necessary.