A.4. Matrices of Linear Systems

Linear Matrices: Systems, Solutions, and Structure 🎯📊

What Are Linear Matrices?

A linear matrix is a matrix that arises from a linear system of equations. These matrices encode relationships in which variables interact linearly: each term is either a constant or a constant times a single variable, with no products or higher powers of variables.

For example, the system:

\begin{cases}
  2x + 3y = 5 \\
  4x -  y = 6
\end{cases}

is written as the matrix equation:

A \cdot \vec{x} = \vec{b}

Where:

A = \begin{bmatrix} 2 & 3 \\ 4 & -1 \end{bmatrix}, \quad
\vec{x} = \begin{bmatrix} x \\ y \end{bmatrix}, \quad
\vec{b} = \begin{bmatrix} 5 \\ 6 \end{bmatrix}

This structure is fundamental in linear algebra and appears in countless applications across science, engineering, and data analysis.
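To see this in practice, here is a minimal sketch, assuming Python with numpy: it builds $A$ and $\vec{b}$ from the example above and solves for the unknowns.

```python
import numpy as np

# Coefficient matrix and constant vector from the example system
A = np.array([[2.0, 3.0],
              [4.0, -1.0]])
b = np.array([5.0, 6.0])

# Solve A @ x = b
x = np.linalg.solve(A, b)
print(x)                      # [1.64285714 0.57142857] -> x = 23/14, y = 4/7
print(np.allclose(A @ x, b))  # True: the solution satisfies both equations
```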


Solving Linear Systems 📐🧠

Linear matrices form the core of systems that can be solved using methods like these (a minimal Gaussian-elimination sketch follows the list):

  • Gaussian elimination
  • Matrix inversion (if $\det A \neq 0$)
  • Cramer's Rule
  • LU decomposition
  • Iterative methods (for large sparse systems)
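As an illustration of the first method, here is a minimal Gaussian-elimination sketch, assuming Python with numpy and a square, nonsingular system:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by forward elimination with partial pivoting,
    then back-substitution. Assumes A is square and nonsingular."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: bring the largest remaining pivot into row k
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        # Eliminate the entries below the pivot
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back-substitution on the resulting upper-triangular system
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 3.0], [4.0, -1.0]])
b = np.array([5.0, 6.0])
print(gaussian_elimination(A, b))  # matches np.linalg.solve(A, b)
```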

These solutions depend on the determinant and rank of the matrix (a numerical check is sketched after the list):

  • If $\det A \neq 0$ and rank equals the number of variables, the system has a unique solution.
  • If $\det A = 0$, solutions may be infinite or nonexistent, depending on consistency.
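A numerical version of this check, sketched with numpy (the helper name and tolerance are illustrative):

```python
import numpy as np

def classify_square_system(A, b, tol=1e-12):
    """Classify a square system A x = b via determinant and rank."""
    if abs(np.linalg.det(A)) > tol:
        return "unique solution"
    # Singular case: compare rank(A) with rank of the augmented matrix [A | b]
    augmented = np.column_stack([A, b])
    if np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented):
        return "infinitely many solutions"
    return "no solution"

A = np.array([[1.0, 2.0], [2.0, 4.0]])                   # det A = 0
print(classify_square_system(A, np.array([3.0, 6.0])))   # infinitely many solutions
print(classify_square_system(A, np.array([3.0, 7.0])))   # no solution
```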

Instead of writing a linear system like this:

\color{#ff887a}
\left\{\begin{matrix}
1 x_1 + 2x_2 = -1 \\
3 x_1 + 4x_2 = 0 \\
5 x_1 + 6x_2 = 5 
\end{matrix}\right.

we could write it like this using matrices:

\color{#fc9a6a}

\begin{pmatrix}
1 & 2 \\
3 & 4 \\
5 & 6
\end{pmatrix} 
\begin{pmatrix}
x_1 \\
x_2
\end{pmatrix} 
=
\begin{pmatrix}
-1 \\
0 \\
5
\end{pmatrix} 
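Note that this example is overdetermined (three equations, two unknowns) and in fact has no solution: the first two equations force $x_1 = 2$, $x_2 = -\tfrac{3}{2}$, but then $5x_1 + 6x_2 = 1 \neq 5$. A rank check, sketched with numpy, confirms it:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([-1.0, 0.0, 5.0])

# Solutions exist iff rank(A) == rank([A | b])  (Rouché–Capelli)
print(np.linalg.matrix_rank(A))                        # 2
print(np.linalg.matrix_rank(np.column_stack([A, b])))  # 3 -> inconsistent
```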

In general, the system of equations

\color{#f2ad61}
\left\{\begin{matrix}
a_{11} x_1 + a_{12} x_2 + \dots + a_{1n} x_n = b_1 \\
a_{21} x_1 + a_{22} x_2 + \dots + a_{2n} x_n = b_2 \\
\vdots \\
a_{m1} x_1 + a_{m2} x_2 + \dots + a_{mn} x_n = b_m
\end{matrix}\right.

can be written as the matrix system

\color{#fc9a6a}
\begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix}
\begin{pmatrix}
x_1 \\
x_2 \\
\vdots \\
x_n
\end{pmatrix}
=
\begin{pmatrix}
b_1 \\
b_2 \\
\vdots \\
b_m
\end{pmatrix}

Matrices and Linear Systems 🧮📐🧠

Matrices are a fundamental tool for organizing and solving systems of linear equations. They provide a structured, scalable, and elegant way to analyze how multiple equations relate to multiple variables.


Writing a System in Matrix Form 🔄

Take the system:

$$ \begin{cases} 2x + 3y = 8 \\ -x + 4y = 1 \end{cases} $$

This can be written in matrix-vector form as $A\vec{x} = \vec{b}$, where:

$$ A = \begin{bmatrix} 2 & 3 \\ -1 & 4 \end{bmatrix} \text{ is the coefficient matrix} $$

$$ \vec{x} = \begin{bmatrix} x \\ y \end{bmatrix} \text{ is the unknown vector} $$

$$ \vec{b} = \begin{bmatrix} 8 \\ 1 \end{bmatrix} \text{ is the constant vector} $$

This compact format allows us to apply linear algebra techniques to study the system.
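For this system, $\det A = 2 \cdot 4 - 3 \cdot (-1) = 11 \neq 0$, so the inverse exists and the inverse-matrix method applies. A minimal numpy sketch:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [-1.0, 4.0]])
b = np.array([8.0, 1.0])

x = np.linalg.inv(A) @ b   # inverse-matrix method: x = A^{-1} b
print(x)                   # [2.63636364 0.90909091] -> x = 29/11, y = 10/11
```

(In practice `np.linalg.solve(A, b)` is preferred over forming the inverse explicitly, but the inverse makes the formula transparent.)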


Why Matrices Matter 🔍

Matrices help us:

  • Represent large systems clearly
  • Analyze structure using concepts like rank, nullity, and determinant
  • Develop efficient, scalable algorithms for solving

Matrix representation is especially useful when working with many variables and equations, or when developing general solution methods.


Solving Techniques 🔧

Several methods can be used to solve matrix equations $A\vec{x} = \vec{b}$ (a worked decomposition example follows the list):

  • Gaussian Elimination: systematically reduces the system to row-echelon form
  • Inverse Matrix Method: if $A^{-1}$ exists, then $\vec{x} = A^{-1}\vec{b}$
  • LU or QR Decomposition: useful for numerical and large-scale solutions

Each approach highlights different properties of the matrix and solution behavior.
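As a worked example of the decomposition route, here is a sketch using numpy's QR factorization: $A = QR$ with $Q$ orthogonal and $R$ upper-triangular, so $A\vec{x} = \vec{b}$ reduces to the triangular system $R\vec{x} = Q^\top\vec{b}$.

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [-1.0, 4.0]])
b = np.array([8.0, 1.0])

# QR decomposition: A = Q R, so A x = b becomes R x = Q^T b
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)
print(x)  # [2.63636364 0.90909091], the same solution as before
```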


Geometric Perspective 🧭

Every equation in the system corresponds to a line (in 2D), a plane (in 3D), or a hyperplane (in higher dimensions). The solution is where these geometric objects intersect. The matrix $A$ defines how the input space is transformed.
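To visualize this with the 2D system above (a sketch assuming matplotlib is available): each equation becomes a line, and the intersection point is the solution found earlier.

```python
import numpy as np
import matplotlib.pyplot as plt

xs = np.linspace(-1, 5, 100)
# 2x + 3y = 8  ->  y = (8 - 2x) / 3
# -x + 4y = 1  ->  y = (1 + x) / 4
plt.plot(xs, (8 - 2 * xs) / 3, label="2x + 3y = 8")
plt.plot(xs, (1 + xs) / 4, label="-x + 4y = 1")
plt.scatter([29 / 11], [10 / 11], zorder=3, label="solution (29/11, 10/11)")
plt.legend()
plt.show()
```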


Insights 🌍

The structure of the matrix $A$ reveals the nature of the system:

  • A unique solution: the system is consistent and independent
  • Infinitely many solutions: the system is consistent but dependent
  • No solution: the system is inconsistent

Linear systems and matrices are deeply connected through linear transformations. By studying $A$, we gain insight into the solution space and the system’s overall behavior across mathematics, science, and engineering.

The System of Equations

Consider this system of linear equations (cue the dramatic music 🎶):

\begin{cases}
a_{11}x + a_{12}y + a_{13}z + a_{14}t = C_1 \\
a_{21}x + a_{22}y + a_{23}z + a_{24}t = C_2 \\
a_{31}x + a_{32}y + a_{33}z + a_{34}t = C_3 \\
a_{41}x + a_{42}y + a_{43}z + a_{44}t = C_4
\end{cases}

Matrix Representation 💡

We can express this whole system in matrix form to make life easier (and cooler). The coefficient matrix is:

A =
\begin{pmatrix}
a_{11} & a_{12} & a_{13} & a_{14} \\
a_{21} & a_{22} & a_{23} & a_{24} \\
a_{31} & a_{32} & a_{33} & a_{34} \\
a_{41} & a_{42} & a_{43} & a_{44}
\end{pmatrix}

As long as its determinant $|A|$ isn’t zero, the system has a unique solution. 💡

Solving for Variables 🕵️‍♂️

To solve for a variable like $x$, Cramer's rule replaces that variable's column of $A$ with the constants ($C_1, C_2, C_3, C_4$), takes the determinant, and divides by $|A|$:

x = \frac{
\begin{vmatrix}
C_1 & a_{12} & a_{13} & a_{14} \\
C_2 & a_{22} & a_{23} & a_{24} \\
C_3 & a_{32} & a_{33} & a_{34} \\
C_4 & a_{42} & a_{43} & a_{44}
\end{vmatrix}
}{
|A|
}

Pro Tip: Cramer's rule only works when $|A| \neq 0$. When $|A| = 0$, the system may still be compatible: it has solutions exactly when the rank of the augmented matrix $(A \mid C)$ equals the rank of $A$ (the Rouché–Capelli theorem).
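A direct implementation of the rule, as a minimal sketch in Python with numpy (fine for small systems; elimination is far cheaper for large ones):

```python
import numpy as np

def cramer_solve(A, c):
    """Solve A x = c by Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by the constants c."""
    det_A = np.linalg.det(A)
    if np.isclose(det_A, 0.0):
        raise ValueError("det(A) = 0: Cramer's rule does not apply")
    x = np.empty(len(c))
    for i in range(len(c)):
        A_i = A.copy()
        A_i[:, i] = c          # substitute column i with the constants
        x[i] = np.linalg.det(A_i) / det_A
    return x

A = np.array([[2.0, 3.0], [-1.0, 4.0]])
c = np.array([8.0, 1.0])
print(cramer_solve(A, c))  # [2.63636364 0.90909091]
```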