Machine Learning 101 - spinningideas/resources GitHub Wiki

Basic Terms

Normalize

Normalize means to scale the data in a uniform way so it fits within a preferred range - usually between 0 and 1.

For example, a grayscale image has no color information - each pixel is just a brightness value, typically an integer between 0 and 255. Normalizing rescales those values into the 0 to 1 range so that every image is analyzed on the same consistent scale.
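A minimal sketch of min-max normalization, using a made-up row of pixel intensities (the values here are arbitrary, chosen only to illustrate the rescaling):

```python
import numpy as np

# Hypothetical grayscale pixel intensities in the usual 0-255 range.
pixels = np.array([0.0, 64.0, 128.0, 255.0])

# Min-max normalization: shift and scale so values land in [0, 1].
normalized = (pixels - pixels.min()) / (pixels.max() - pixels.min())

print(normalized)  # all values now lie between 0 and 1
```

The same formula works for any numeric feature, not just pixels; the point is that the range becomes uniform across inputs.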

Gradient

A gradient is an un-attached vector of partial derivatives - a direction plus a magnitude. It tells you which way to move and by how much to change a function's output fastest; it does not itself know what to move. In training, an optimizer applies the gradient to the model's weights.
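As a sketch, the gradient of a simple function can be approximated numerically with finite differences (the function f and the evaluation point are made up for illustration):

```python
# Gradient of f(x, y) = x**2 + 3*y, approximated by central differences.
def f(x, y):
    return x ** 2 + 3 * y

def gradient(x, y, h=1e-6):
    # Each component is a partial derivative: how f changes
    # when one input nudges while the other stays fixed.
    df_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    df_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return (df_dx, df_dy)

g = gradient(2.0, 1.0)
# The analytic gradient is (2x, 3) = (4, 3); the numeric result is close.
```

Deep learning libraries compute exact gradients via automatic differentiation rather than finite differences, but the meaning is the same.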

Loss Function

A loss function tells you how much the value you got by performing calculations on the input data differs from the expected value - the one you should have got. There are many types of loss functions (mean squared error and cross-entropy are common examples).

Weights

Each neuron in a layer of a neural network has one weight for each of its connections to neurons in the next layer (not every neuron is necessarily connected to every neuron in the next layer). A layer tensor can refer either to a) the tensor containing all the weights of the layer, or b) the layer's activation map - the previous layer's outputs multiplied by their respective weights. Both are represented as tensors; in the two-dimensional case, matrices.
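A sketch of a single fully connected layer, with made-up input values, weights, and biases, showing how the weight matrix connects one layer's outputs to the next:

```python
import numpy as np

# Hypothetical fully connected layer: 3 inputs feeding 2 neurons.
inputs = np.array([1.0, 0.5, -1.0])       # previous layer's outputs

# One weight per connection: row i, column j is the weight
# from input neuron i to output neuron j.
weights = np.array([[0.2, -0.4],
                    [0.7,  0.1],
                    [-0.3, 0.5]])
bias = np.array([0.1, 0.0])

# Activation map: each input scaled by its weight, summed per neuron, plus bias.
activations = inputs @ weights + bias
```

Here `weights` is the layer's weight tensor (sense a above) and `activations` is its activation map (sense b).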

Tensor

A tensor is a generalization of vectors and matrices and is easily understood as a multidimensional array.
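The generalization can be seen directly with NumPy arrays of increasing rank (the particular shapes and values below are arbitrary examples):

```python
import numpy as np

scalar = np.array(5.0)             # rank-0 tensor: a single number
vector = np.array([1.0, 2.0])      # rank-1 tensor: a vector
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])    # rank-2 tensor: a matrix
cube = np.zeros((2, 3, 4))         # rank-3 tensor: a 2x3x4 array

# The rank (number of dimensions) generalizes from 0 upward;
# a color image batch, for instance, is typically a rank-4 tensor.
```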

Use Cases for Machine Learning Techniques

Models By Domain

Libraries for Machine Learning

https://github.com/spinningideas/resources/wiki/Machine-Learning-Libraries

Education and Learning Resources

https://github.com/spinningideas/resources/wiki/Machine-Learning---Learning-Path

Books

https://github.com/spinningideas/resources/wiki/Machine-Learning---Learning-Path#machine-learning---books

Learning Workbenches

Examples

Recommendation Engines

Tic Tac Toe

Data

PyTorch

Tensorflow

Hosting

Data Sets

Music

Papers