Guide Linear Algebra - UQ-Communication-Systems/public GitHub Wiki
## A least squares estimate
Imagine we have two vectors `[0, 2, 5, 8]` and `[0.5, 1.8, 11, 15.3]`, and we would like to fit them with a linear equation (a so-called line of best fit). Here are some methods to do this in Matlab and Python.
### y = mx
In Python:

```python
# Least Squares
import numpy as np
x = np.array([0, 2, 5, 8])
y = np.array([0.5, 1.8, 11, 15.3])
# Use the following model: y = mx
# lstsq expects a 2-D design matrix, so reshape x into a column vector
x_dash = x.reshape((len(x), 1))
m = np.linalg.lstsq(x_dash, y, rcond=None)[0]
print(m)
```
In Matlab:

```matlab
% Least Squares
x = [0, 2, 5, 8];
y = [0.5, 1.8, 11, 15.3];
% Use the following model: y = mx
% The backslash operator solves the least-squares problem directly
m = transpose(x) \ transpose(y)
```
Looking at the values, we expect a gradient of around 2, and this code gets close to that.
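As a sanity check (this derivation is not part of the original guide), the single-parameter model y = mx has a closed-form least-squares solution, m = (x · y) / (x · x), which should agree with what `lstsq` returns. A minimal sketch:

```python
import numpy as np

x = np.array([0, 2, 5, 8])
y = np.array([0.5, 1.8, 11, 15.3])

# Closed form for the one-parameter model y = mx:
# minimising ||y - m*x||^2 gives m = (x . y) / (x . x)
m_closed = (x @ y) / (x @ x)

# lstsq on the column vector x solves the same problem
m_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

print(m_closed, m_lstsq)  # both around 1.95, i.e. close to 2
```

Here `m_closed` works out to 181/93 ≈ 1.95, which matches the intuition that the gradient should be near 2.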
### y = mx + c
If we want to model a gradient and a non-zero y-intercept, we can adjust the model.
In Python:

```python
A = np.column_stack((x, np.ones(len(x))))
# In this case, we are solving the general linear equation Ax = b
m, c = np.linalg.lstsq(A, y, rcond=None)[0]
print(m, c)
```
In Matlab:

```matlab
A = [transpose(x), transpose(ones(size(x)))];
% In this case, we are solving the general linear equation Ax = b
m_c = A \ transpose(y)
```
Note that the original x and y were almost linearly related, so we expect m to be close to 2.
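Under the hood, `lstsq` (and Matlab's backslash) solve the normal equations AᵀA p = Aᵀy for the parameter vector p = [m, c]. As an illustration (not from the original guide), we can solve those equations directly and confirm they give the same answer:

```python
import numpy as np

x = np.array([0, 2, 5, 8])
y = np.array([0.5, 1.8, 11, 15.3])
A = np.column_stack((x, np.ones(len(x))))

# Normal equations: (A^T A) p = A^T y, with p = [m, c]
p_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Same fit via lstsq
p_lstsq = np.linalg.lstsq(A, y, rcond=None)[0]

print(p_normal, p_lstsq)  # m close to 2, c small and slightly negative
```

In practice `lstsq` is preferred over forming AᵀA explicitly, since it is more numerically stable for ill-conditioned design matrices.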
### y = nx^2 + mx + c
We can also extend this to higher-order polynomials by adjusting the matrix A appropriately.
In Python:

```python
A = np.column_stack((x**2, x, np.ones(len(x))))
n, m, c = np.linalg.lstsq(A, y, rcond=None)[0]
print(n, m, c)
```
In Matlab:

```matlab
A = [transpose(x.^2), transpose(x), transpose(ones(size(x)))];
n_m_c = A \ transpose(y)
```
Again, the original x and y were almost linearly related, so we expect m to be close to 2, n to be close to 0, and c to be small.
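For polynomial models specifically, NumPy also provides `np.polyfit`, which builds this same design matrix internally and returns coefficients in highest-power-first order. As a cross-check (not part of the original guide), it should agree with the manual `lstsq` approach:

```python
import numpy as np

x = np.array([0, 2, 5, 8])
y = np.array([0.5, 1.8, 11, 15.3])

# Manual approach: build the Vandermonde-style matrix and call lstsq
A = np.column_stack((x**2, x, np.ones(len(x))))
coeffs_lstsq = np.linalg.lstsq(A, y, rcond=None)[0]

# polyfit(x, y, 2) returns [n, m, c] in the same highest-first order
coeffs_polyfit = np.polyfit(x, y, 2)

print(coeffs_lstsq)
print(coeffs_polyfit)
```

Both approaches solve the same least-squares problem; `polyfit` is just a convenience wrapper for the polynomial case.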