Mathematics
Math Notes: Simple Linear Regression
Simple linear regression is the process of identifying the linear function (a non-vertical straight line) that best describes the relationship between one independent and one dependent variable. To do this we need a way to assess the fit. The simplest approach is to measure the difference between the actual values and the predicted ones; however, if we just take the raw differences the positives and negatives may cancel each other out, so we take the squares instead. Our aim is to minimise the “loss” ($\mathcal{L}$) of the function.
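The idea above can be sketched in a few lines of Python. This is a minimal illustration, not the post's own code: it fits a line by the closed-form least-squares solution, which is exactly the slope and intercept that minimise the squared loss described above. All names (`fit_line`, `squared_loss`) are illustrative.

```python
# Minimal sketch: fit y = slope*x + intercept by minimising the squared
# loss L = sum((y - y_hat)^2). The closed-form least-squares solution.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

def squared_loss(xs, ys, slope, intercept):
    # Sum of squared differences between actual and predicted values.
    return sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))

slope, intercept = fit_line([1, 2, 3], [2, 4, 6])  # perfectly linear data
```

For the perfectly linear data above the loss at the fitted line is zero; for noisy data it is merely the smallest achievable.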
Math Notes: Factorising Quadratics
This is one of those fundamental bits of math that I could not get my head round in school. There are three ways to do this. For the moment this only covers the simplest of the techniques.
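The simplest of those techniques, for a quadratic with a leading coefficient of 1, is to look for two numbers whose product is the constant term and whose sum is the middle coefficient. A worked example (my own, not taken from the post):

```latex
% Factorise x^2 + 5x + 6: find two numbers whose product is 6
% and whose sum is 5 -- here 2 and 3.
\[
x^2 + 5x + 6 = (x + 2)(x + 3)
\]
% Check by expanding: (x+2)(x+3) = x^2 + 3x + 2x + 6 = x^2 + 5x + 6.
\]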
Math Notes: A Short Intro To Differentiation
Differentiation is the process of computing the gradient (derivative) of an arbitrary function; in many simple cases it can be used to minimise a function describing the loss (error) of our model.
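As a minimal sketch of that idea (my own illustration, assuming nothing beyond the excerpt): approximate the derivative numerically, then repeatedly step against it, a basic gradient descent, to minimise a simple loss. Function names and the example loss are hypothetical.

```python
# Minimal sketch: use the derivative to minimise a loss function.
# Example loss L(w) = (w - 3)^2, which has its minimum at w = 3.

def gradient(loss, w, h=1e-6):
    """Central-difference approximation of the derivative of loss at w."""
    return (loss(w + h) - loss(w - h)) / (2 * h)

def minimise(loss, w=0.0, lr=0.1, steps=100):
    """Gradient descent: step against the gradient until it (nearly) vanishes."""
    for _ in range(steps):
        w -= lr * gradient(loss, w)
    return w

w_star = minimise(lambda w: (w - 3) ** 2)  # converges towards w = 3
```

Each step shrinks the distance to the minimum by a constant factor here, which is why a hundred small steps suffice.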
Math Notes: Notes On Notation
Uppercase and lowercase Roman letters
Math Notes: The Types and Uses of Variables
If it can take on different values, it’s a variable. They come in a variety of flavours:
Math Notes: The Confusion Matrix
A confusion matrix is a simple way to visually present the accuracy of a classification algorithm. A confusion matrix can only be constructed for values which are already known, so this analysis uses your labelled evaluation set rather than your unlabelled test set.¹

¹ How best to divide up your data is covered in Model Selection and Evaluation.
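Constructing one is a matter of counting (actual, predicted) pairs. A minimal sketch using the standard library; the labels and helper name are illustrative, not from the post:

```python
# Minimal sketch: build a confusion matrix from known labels and the
# classifier's predictions on the labelled evaluation set.
from collections import Counter

def confusion_matrix(actual, predicted):
    """Count each (actual, predicted) pair, e.g. ('spam', 'ham')."""
    return Counter(zip(actual, predicted))

actual    = ["spam", "spam", "ham", "ham", "ham"]
predicted = ["spam", "ham",  "ham", "ham", "spam"]
cm = confusion_matrix(actual, predicted)
# cm[("spam", "spam")] counts true positives for the "spam" class,
# cm[("ham", "spam")] counts false positives, and so on.
```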

Euclidean Distance
Euclidean distance is the ‘ordinary’ straight-line distance between two points in Euclidean space.
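That distance is the square root of the sum of squared coordinate differences, which works in any number of dimensions. A minimal sketch (my own illustration):

```python
# Minimal sketch: straight-line (Euclidean) distance between two points,
# sqrt of the sum of squared coordinate differences.
import math

def euclidean_distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

euclidean_distance((0, 0), (3, 4))  # the classic 3-4-5 triangle: 5.0
```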
The Cosine Rule and Dot Product
The cosine rule
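The link between the two is that the dot product satisfies $u \cdot v = |u||v|\cos\theta$, from which the cosine rule follows. A minimal sketch recovering the angle between two vectors this way (function names are illustrative):

```python
# Minimal sketch: the dot product u . v = |u| |v| cos(theta) lets us
# recover the angle between two vectors, which is the heart of the
# cosine rule c^2 = a^2 + b^2 - 2ab*cos(C).
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def angle_between(u, v):
    """Angle in radians between vectors u and v, via the dot product."""
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

angle_between((1, 0), (0, 1))  # perpendicular vectors: pi / 2
```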