43. Vector Spaces - JulTob/Mathematics GitHub Wiki

# Vector Spaces and Their Models in Type Theory

## ℝ: The Simplest Vector Space
The real numbers ℝ form a vector space over themselves. Addition and scalar multiplication are the standard operations.
The sum of two real numbers is an operation that associates to each pair of real numbers $a$ and $b$ another real number, denoted by $a + b$.
$$\color{gold}
+_\mathbb{R} : \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}, \quad (a, b) \mapsto a + b
$$

$$\color{gold}
[ℝ] + [ℝ] = [ℝ]
$$
The sum of real numbers is:
- commutative:
  - $a+b=b+a$ for each $a,b ∈ ℝ$;
- associative:
  - $(a+b)+c=a+(b+c)$ for each $a,b,c ∈ ℝ$;
- admits a neutral element:
  - there exists a number, $0$, such that $0+a = a+0 = a$ for every $a ∈ ℝ$;
- every real number $a$ admits an opposite:
  - there exists a number, which we denote by $-a$ or $¬a$, such that $a + (¬a) = 0 = (¬a) + a$.
The product of two real numbers is an operation that associates to each pair of real numbers $a$ and $b$ another real number, denoted by $ab$. Therefore, the product is a function whose domain is $ℝ × ℝ$ and codomain is $ℝ$:
$$\color{gold}
\times_\mathbb{R} : \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}, \quad (a, b) \mapsto ab
$$

$$\color{gold}
[ℝ] × [ℝ] = [ℝ]
$$
The product of real numbers is:
- commutative:
  - $ab=ba$ for each $a,b ∈ ℝ$;
- associative:
  - $(ab)c=a(bc)$ for each $a,b,c ∈ ℝ$;
- admits a neutral element:
  - there exists a number, $1$, such that $1a = a1 = a$ for every $a ∈ ℝ$;
- distributive with respect to the sum:
  - $a(b+c) = ab + ac$ for every $a,b,c ∈ ℝ$.
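These axioms lend themselves to a quick numerical spot-check. Below is a minimal sketch (the function name and sample values are ours, not from the text); the samples are exact binary fractions, so floating-point arithmetic stays exact throughout.

```python
def check_real_axioms(samples):
    """Spot-check the listed axioms of (ℝ, +, ×) on sample values.

    A finite numeric check only illustrates the axioms, it does
    not prove them.
    """
    for a in samples:
        assert a + 0 == 0 + a == a          # neutral element of the sum
        assert a * 1 == 1 * a == a          # neutral element of the product
        assert a + (-a) == 0                # opposite
        for b in samples:
            assert a + b == b + a           # sum is commutative
            assert a * b == b * a           # product is commutative
            for c in samples:
                assert (a + b) + c == a + (b + c)      # sum is associative
                assert (a * b) * c == a * (b * c)      # product is associative
                assert a * (b + c) == a * b + a * c    # distributivity
    return True

print(check_real_axioms([-2.5, 0.0, 1.0, 3.0]))  # True
```

Each assertion mirrors one bullet above; nesting the loops covers every combination of the sample values.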
## ℝ²: Ordered Pairs
```mermaid
%%{init: {"quadrantChart": {"pointRadius": 1, "pointTextPadding": 13, "titlePadding": 20}}}%%
quadrantChart
    title Vector space in 2D
    x-axis x
    y-axis y
    A: [0.3, 0.6]
    B: [0.45, 0.23]
    C: [0.57, 0.99]
    D: [0.78, 0.34]
    E: [0.40, 0.34]
    F: [0.35, 0.78]
```
We denote by the symbol ℝ² the set of ordered pairs of real numbers:

$$ℝ^2 = \{(x, y) : x, y ∈ ℝ\}$$
The fact that the pairs are ordered means, for example, that the element $(1, 2)$ is different from the element $(2, 1)$.
Once we fix a Cartesian coordinate system in a plane, there is a correspondence between βΒ² and the set of points in the plane.
Attaching a Cartesian reference to the plane means fixing two oriented perpendicular lines r and s and a unit of measure. The point of intersection between the two straight lines is called the origin of the reference system.
Each point of the plane is then uniquely identified by a pair of real numbers, called coordinates of the point, which indicate the distance of the point from the line s and its distance from the line r, respectively.
The student who is not familiar with the Cartesian plane can think of the boardgame Battleship.
- type $ℝ² := ℝ × ℝ$
- const $zero : ℝ² := (0, 0)$
Sum:

- $+ : ℝ² × ℝ² → ℝ²$
- $[ℝ²] + [ℝ²] = [ℝ²]$
- $λ((x,y),(x′,y′)) ↦ (x,y)+(x′,y′) = (x+x′, y+y′)$
- Commutative
- Associative
- Neutral: $(0,0)$
- Opposite: $(¬x, ¬y)$
Multiplication by a real number (scalar):

- $· : ℝ × ℝ² → ℝ²$
- $[ℝ]·[ℝ²] = [ℝ²]$
- $λ(r, (x, y)) ↦ r(x, y) = (rx, ry)$
- Distributive
  - $(a+b)(x,y) = a(x,y) + b(x,y)$
  - $(ab)(x,y) = a(b(x,y))$
  - $1(x,y) = (x,y)$
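A minimal executable model of these ℝ² operations, representing pairs as Python tuples (the function names `add2`/`scale2` are ours):

```python
def add2(v, w):
    """Sum on ℝ²: λ((x,y),(x′,y′)) ↦ (x+x′, y+y′)."""
    (x, y), (x2, y2) = v, w
    return (x + x2, y + y2)

def scale2(r, v):
    """Scalar multiplication ℝ × ℝ² → ℝ²: λ(r,(x,y)) ↦ (rx, ry)."""
    x, y = v
    return (r * x, r * y)

zero2 = (0.0, 0.0)                        # neutral element of the sum

v = (1.0, 2.0)
assert add2(v, zero2) == v                # neutral
assert add2(v, scale2(-1.0, v)) == zero2  # opposite: (¬x, ¬y) = -1·(x, y)
# distributive: (a+b)(x,y) = a(x,y) + b(x,y), here with a = b = 1 folded
# into a sum of two different vectors scaled by 2:
assert scale2(2.0, add2(v, (3.0, 4.0))) == add2(scale2(2.0, v), scale2(2.0, (3.0, 4.0)))
```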
## ℝⁿ: General Tuples
We can generalize to tuples of real numbers of any size: n-tuples.

$$(x_1, \dots, x_n) + (x'_1, \dots, x'_n) = (x_1 + x'_1, \dots, x_n + x'_n)$$

$$r(x_1, \dots, x_n) = (r·x_1, \dots, r·x_n)$$
$$
\begin{matrix}
type & ℝⁿ & : & ⨯^n ℝ & := & ⟨x_i⟩ \\
const & zero & : & ℝⁿ & := & ⟨0_i⟩ \\
function & add & is & (a:ℝⁿ,\ b:ℝⁿ) : ℝⁿ & := & ⟨a_i+b_i⟩ \\
function & scale & is & (r:ℝ,\ v:ℝⁿ) : ℝⁿ & := & r⟨v_i⟩ := ⟨r·v_i⟩ \\
\end{matrix}
$$
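The type-theory table above maps directly onto tuples. A hedged Python rendering (the identifiers `add`, `scale`, `zero` follow the table; the representation is our choice):

```python
def add(a, b):
    """add (a:ℝⁿ, b:ℝⁿ) : ℝⁿ := ⟨a_i + b_i⟩ — componentwise sum."""
    assert len(a) == len(b), "both n-tuples must have the same n"
    return tuple(x + y for x, y in zip(a, b))

def scale(r, v):
    """scale (r:ℝ, v:ℝⁿ) : ℝⁿ := ⟨r·v_i⟩ — scale every component."""
    return tuple(r * x for x in v)

def zero(n):
    """zero : ℝⁿ := ⟨0_i⟩ — the zero vector of ℝⁿ."""
    return (0.0,) * n

v = (1.0, 2.0, 3.0)
assert add(v, zero(3)) == v
assert scale(2.0, v) == (2.0, 4.0, 6.0)
```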
## Square Matrices
$$
\mathcal{M}_2(\mathbb{R}) = \begin{Bmatrix}
\begin{pmatrix}
a & b \\
c & d
\end{pmatrix} \;|\; a, b, c, d \in \mathbb{R}
\end{Bmatrix}
$$
Matrix operations are defined elementwise.

We define the sum as:

- $+ : \mathcal{M}_2(ℝ) × \mathcal{M}_2(ℝ) → \mathcal{M}_2(ℝ)$
- $[\mathcal{M}_2(ℝ)] + [\mathcal{M}_2(ℝ)] = [\mathcal{M}_2(ℝ)]$
$$
\begin{pmatrix}
a & b \\
c & d
\end{pmatrix} + \begin{pmatrix}
a' & b' \\
c' & d'
\end{pmatrix} = \begin{pmatrix}
a+a' & b+b' \\
c+c' & d+d'
\end{pmatrix}
$$
and scalar multiplication as:

- $· : ℝ × \mathcal{M}_2(ℝ) → \mathcal{M}_2(ℝ)$
- $[ℝ] · [\mathcal{M}_2(ℝ)] = [\mathcal{M}_2(ℝ)]$
$$
\alpha \cdot \begin{pmatrix}
a & b \\
c & d
\end{pmatrix} = \begin{pmatrix}
\alpha \cdot a & \alpha \cdot b \\
\alpha \cdot c & \alpha \cdot d
\end{pmatrix}
$$
$$
\begin{matrix}
type & 𝕄_{2⨯2} & : & (ℝ × ℝ) × (ℝ × ℝ) & := & \begin{pmatrix} a & b \\ c & d \end{pmatrix} \\
const & zero & : & 𝕄_{2⨯2} & := & [0_{ij}] \\
function & add & is & (a:𝕄_{2⨯2},\ b:𝕄_{2⨯2}) : 𝕄_{2⨯2} & := & [a_{ij}+b_{ij}] \\
function & scale & is & (r:ℝ,\ m:𝕄_{2⨯2}) : 𝕄_{2⨯2} & := & [r·m_{ij}] \\
\end{matrix}
$$
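The same table as runnable code, representing a 2×2 matrix as a nested tuple of rows (a sketch; the names and the representation are our choices):

```python
def mat_add(a, b):
    """add (a, b) := [a_ij + b_ij] — entrywise sum of two 2×2 matrices."""
    return tuple(tuple(x + y for x, y in zip(ra, rb))
                 for ra, rb in zip(a, b))

def mat_scale(r, m):
    """scale (r, m) := [r·m_ij] — multiply every entry by the scalar r."""
    return tuple(tuple(r * x for x in row) for row in m)

mat_zero = ((0.0, 0.0), (0.0, 0.0))    # [0_ij]

m = ((1.0, 2.0), (3.0, 4.0))
assert mat_add(m, mat_zero) == m
assert mat_scale(2.0, m) == ((2.0, 4.0), (6.0, 8.0))
```

Note that only the entrywise operations are needed here; matrix multiplication plays no role in the vector-space structure.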
## Vector Spaces
A real vector space is a set (or type) $V$ equipped with two operations, called sum and scalar multiplication, where the scalars belong to a field $𝕂$ (for real vector spaces, $𝕂 = ℝ$).
$$
\begin{matrix}
+ : V ⨯ V ⟶ V & & & · : 𝕂 ⨯ V ⟶ V \\
+(𝐯,𝐰) ⟼ 𝐯+𝐰 & & & ·(𝛼,𝐯) ⟼ 𝛼𝐯 \\
\end{matrix}
$$
satisfying the following properties, for all $𝐯, 𝐰, 𝐮 ∈ V$ and $𝛼, 𝛽 ∈ 𝕂$:

- Commutative
  - $𝐯+𝐰 = 𝐰+𝐯$
- Associative
  - $(𝐯+𝐰)+𝐮 = 𝐯+(𝐰+𝐮)$
- Neutral element of the sum
  - there exists $𝟎 ∈ V$ such that $𝐯+𝟎 = 𝐯$
- Opposite
  - $𝐯+(¬𝐯) = 𝟎$
- Scalar unity
  - $1𝐯 = 𝐯$
- $(𝛼𝛽)𝐯 = 𝛼(𝛽𝐯)$
- $𝛼(𝐯+𝐰) = 𝛼𝐯+𝛼𝐰$
- $(𝛼+𝛽)𝐯 = 𝛼𝐯+𝛽𝐯$
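Because the axioms mention only `+`, `·` and the zero vector, they can be spot-checked uniformly for any candidate model. A sketch of such a checker (all names are ours; a finite check illustrates the axioms, it does not prove them):

```python
def check_vector_axioms(add, scale, zero, vectors, scalars):
    """Assert the vector-space axioms on finite samples of a model."""
    for v in vectors:
        assert add(v, zero) == v                      # neutral element
        assert scale(1.0, v) == v                     # scalar unity
        assert add(v, scale(-1.0, v)) == zero         # opposite
        for w in vectors:
            assert add(v, w) == add(w, v)             # commutative
            for u in vectors:
                assert add(add(u, v), w) == add(u, add(v, w))  # associative
            for a in scalars:
                assert scale(a, add(v, w)) == add(scale(a, v), scale(a, w))
        for a in scalars:
            for b in scalars:
                assert scale(a * b, v) == scale(a, scale(b, v))
                assert scale(a + b, v) == add(scale(a, v), scale(b, v))
    return True

# Model ℝ² with tuples and check it on exactly-representable samples:
pair_add = lambda v, w: (v[0] + w[0], v[1] + w[1])
pair_scale = lambda r, v: (r * v[0], r * v[1])
check_vector_axioms(pair_add, pair_scale, (0.0, 0.0),
                    vectors=[(0.0, 0.0), (1.0, 2.0), (-0.5, 3.0)],
                    scalars=[-1.0, 0.0, 2.0])
```

The same checker could be pointed at the tuple or matrix models from the earlier sections without modification, which is exactly the point of the abstract definition.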
The elements of a vector space are called vectors, while the elements of $𝕂$ are called scalars. The neutral element of the sum in $V$ is called the zero vector. To distinguish vectors from numbers we write vectors in bold.
## Things that can be vector spaces

- Functions, with the pointwise operations
  - $(f+g)(x) = f(x) + g(x)$
  - $(𝛼f)(x) = 𝛼f(x)$
- Polynomials
  - Consider the set $ℝ[x]$ of polynomials with real coefficients in the variable $x$.
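The pointwise operations can be modelled directly on Python functions; polynomials then appear as a special case of real-valued functions. A sketch (the names `fadd` and `fscale` are ours):

```python
def fadd(f, g):
    """(f + g)(x) = f(x) + g(x) — pointwise sum of two functions."""
    return lambda x: f(x) + g(x)

def fscale(a, f):
    """(a·f)(x) = a·f(x) — pointwise scalar multiplication."""
    return lambda x: a * f(x)

# A polynomial in ℝ[x], viewed as a real-valued function:
p = lambda x: 1.0 + 2.0 * x        # p(x) = 1 + 2x
q = lambda x: x * x                # q(x) = x²
s = fadd(p, fscale(3.0, q))        # s(x) = 1 + 2x + 3x²
assert s(2.0) == 1.0 + 4.0 + 12.0  # = 17.0
```

The zero vector of this space is the constant function `lambda x: 0.0`.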
## Non-essential properties

We can also derive these extra properties:

- The zero vector is unique; we denote it $𝟎_V$.
- Every vector $𝐯$ has exactly one opposite $¬𝐯$, such that $𝐯+(¬𝐯) = 𝟎_V$.
- $𝛼·𝟎_V = 𝟎_V$ for all $𝛼$
- $0·𝐯 = 𝟎_V$ for any $𝐯$
- $(-𝛼)·𝐯 = 𝛼·(¬𝐯) = ¬(𝛼·𝐯)$
The trivial vector space is the space $\{𝟎_V\}$ consisting only of the zero vector.

- $𝟎_V + 𝟎_V = 𝟎_V$
- $𝛼·𝟎_V = 𝟎_V$ for all $𝛼$

A real vector space can never be empty: it must contain the zero vector. It may contain nothing else; we call such spaces trivial.

A vector space is non-trivial if it has at least one vector $𝐯$ other than the zero vector. Since we can multiply by any real scalar, it must then contain infinitely many elements: all the multiples of $𝐯$, that is, all the vectors of the form

$$𝛼𝐯 ∈ V \quad ∀𝛼 ∈ ℝ$$

Therefore, a line that goes through the origin is a vector space.
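This closure argument can be made concrete: the multiples of a fixed 𝐯 in ℝ² form a set closed under both operations. A small sketch (names and sample values are ours):

```python
def multiple(a, v):
    """The multiple a·𝐯 of a fixed vector 𝐯 ∈ ℝ²."""
    return (a * v[0], a * v[1])

v = (1.0, 2.0)  # any nonzero vector spans a line through the origin

# The sum of two multiples is again a multiple: a·𝐯 + b·𝐯 = (a+b)·𝐯,
# so the line {𝛼𝐯 : 𝛼 ∈ ℝ} is closed under addition ...
a, b = 0.5, 2.0
p1, p2 = multiple(a, v), multiple(b, v)
assert (p1[0] + p2[0], p1[1] + p2[1]) == multiple(a + b, v)

# ... and under scaling: r·(a·𝐯) = (r·a)·𝐯.
assert multiple(3.0, multiple(a, v)) == multiple(3.0 * a, v)
```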
```mermaid
---
title: "Vector Space"
config:
  radar:
    axisScaleFactor: 1
    curveTension: 0
  theme: dark
  themeVariables:
    cScale0: "#FF0000"
    cScale1: "#00FF00"
    cScale2: "#FFFF00"
    radar:
      curveOpacity: 0.075
      graticuleColor: cyan
      graticuleOpacity: 0.05
---
radar-beta
  axis a["A"], b["B"], c["C"]
  axis d["D"], e["E"], f["F"]
  curve v1["V1"]{3, 3, 0, 2, 5, 4}
  curve v2["V2"]{7, 5, 4, 1.5, 2, 1}
  curve v3["V1+V2"]{10, 8, 4, 3.5, 7, 5}
  max 10
  min 0
  ticks 10
  graticule polygon
```
## Examples

- $ℝⁿ$: vectors of dimension $n$
- $M_n(ℝ)$: $n×n$ matrices
- $ℝ[x]$: polynomials with real coefficients
- $(X → ℝ)$: real-valued functions

A non-trivial space contains some $𝐯 ≠ 𝟎$, and by scalar multiplication all multiples $a·𝐯$ exist. A one-dimensional vector space corresponds to a line through the origin.