Vector Space and Matrices

Linear Subspace

  • The zero element has to be part of the subspace.

  • Any linear combination (multiplication by a scalar, addition of two elements) of elements belonging to the subspace also needs to belong to the subspace.

Spanning

To span a complete $n$-dimensional space, we need at least $n$ independent vectors of dimension $n$. For example, to span all of $\mathbb{R}^n$, you need at least $n$ independent vectors $x$ such that $x \in \mathbb{R}^n$.

Determining number of linear dependences

Let's say you have $K$ vectors $x_i \in \mathbb{R}^n$, with $K > n$. But at most $n$ linearly independent vectors are needed to span the whole of $\mathbb{R}^n$. Hence when you have $K$ vectors, some of them have to be linearly dependent on the others: there are at least $K - n$ independent linear dependences among the $K$ vectors, and exactly $K - n$ when the vectors span all of $\mathbb{R}^n$ (in general the count is $K$ minus the rank).
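
As a quick sanity check, here is a small numpy sketch that counts the dependences as $K$ minus the rank; the example vectors are arbitrary ones chosen for illustration:

```python
import numpy as np

# 4 vectors in R^2 (K = 4 > n = 2), stacked as columns of a matrix.
vectors = np.array([[1., 0., 1., 2.],
                    [0., 1., 1., 3.]])

K = vectors.shape[1]
rank = np.linalg.matrix_rank(vectors)  # number of linearly independent vectors
print(K - rank)  # number of independent linear dependences: 4 - 2 = 2
```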

Linear Combinations are vectors!

Say you have three vectors $\vec{a}, \vec{b}, \vec{c}$ such that $\vec{a} + \vec{b} + \vec{c} = \vec{0}$; this equation is a linear dependence. But that linear combination is actually a vector, the $\vec{0}$ vector. So you can treat each linear dependence equation as a vector: its coefficients (here $(1, 1, 1)$) form a vector.

Null space

The span formed by the coefficient vectors of the linear combinations of the given vectors that equal zero.

i.e., the null space of given $k$ vectors is the span of the vectors made up of the coefficients of the linear dependence eq'ns among those $k$ vectors. One linear dependence eq'n gives one vector; hence if there are $m$ dependence eq'ns among those $k$ vectors, the null space is the span of those $m$ vectors.

Essentially, the null space of given $k$ vectors is the space of vectors which always give zero when those $k$ vectors are linearly combined with the coefficients of the vector. i.e., let $A$ be the matrix formed by stacking the $k$ vectors as columns of $A$. Then the null space is the space of all vectors $x$ such that $Ax = 0$. Note that if the $k$ vectors are linearly independent, then there is no non-trivial $x$ possible, and the null space is just the zero vector.

You can also see the null space as defining the relationships among the columns of the matrix. The null space captures the complete set of relations among the columns, i.e., it defines the linear dependences of the columns.
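
A minimal sketch of this view using scipy; the matrix here is a made-up example whose third column is the sum of the first two:

```python
import numpy as np
from scipy.linalg import null_space

# Columns: c3 = c1 + c2, so there is exactly one linear dependence.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.]])

N = null_space(A)             # orthonormal basis for {x : Ax = 0}
print(N.shape)                # (3, 1): one basis vector, one dependence
print(np.allclose(A @ N, 0))  # True: every null-space vector sends A to zero
# The basis vector is proportional to (1, 1, -1), i.e. c1 + c2 - c3 = 0.
```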

Spanning Set

A set of vectors that can express any other vector from the same vector space as a linear combination. There is some minimum number of vectors needed in the spanning set in order to express all the vectors in the vector space. For example, if you have one 2D vector, you can only express the vectors that lie on the same line as that one vector. But if you have two 2D vectors which are not collinear, then you can span all the vectors in the plane using their linear combinations. A spanning set with the minimum number of vectors (those vectors have to be linearly independent) is called a basis.

Dimension of vector space

The minimum number of vectors needed in the spanning set in order to express all the vectors in that vector space (of course the vectors have to be linearly independent) is called the dimension of that vector space.

Basis

A spanning set with the minimum number of vectors needed to express all other vectors in the space is called a basis. A basis can express all vectors as linear combinations, and it does so uniquely. The number of vectors in the basis is the dimension of that vector space.

A Way to Look at Linear Systems of Eq'ns

Null space vs Column Space

Null Space: Analyzing the null space of given vectors helps us know the dependence between the vectors, i.e., whether the vectors are linearly dependent or independent.

Column Space: The column space is the span of the given vectors; it helps us analyze what the vectors span.

The null space helps us tell whether the eq'n has infinitely many solutions or a unique one, and the column space can help tell whether there is a solution at all or none.

One can easily see that $\text{dim}(R) + \text{dim}(N) =$ number of columns, where $R$ is the column space and $N$ is the null space (this is the rank-nullity theorem).
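
A quick numerical check of this count, using an arbitrary example matrix and scipy's null_space:

```python
import numpy as np
from scipy.linalg import null_space

# Example: the third row is the sum of the first two, so the rank is 2.
A = np.array([[1., 0., 1., 2.],
              [0., 1., 1., 3.],
              [1., 1., 2., 5.]])

dim_R = np.linalg.matrix_rank(A)  # dimension of the column space
dim_N = null_space(A).shape[1]    # dimension of the null space
print(dim_R, dim_N, A.shape[1])   # 2 + 2 = 4 columns
```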

The point of Gaussian elimination is to make the relations among the given vectors easier to parse, or more evident. It helps to find the span of the vectors, to see whether the vectors are linearly independent or dependent, to check whether some vector is indeed in the span of the given vectors, etc.
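
For instance, a small sympy sketch: rref() runs Gaussian elimination down to reduced row echelon form, which makes the column relations evident (the matrix is a made-up example):

```python
import sympy as sp

# Columns of A: c3 = 2*c1 + c2. Row reduction makes this relation evident.
A = sp.Matrix([[1, 0, 2],
               [1, 1, 3],
               [2, 1, 5]])

rref, pivot_cols = A.rref()
print(pivot_cols)  # (0, 1): the first two columns are independent
sp.pprint(rref)    # the third column of the RREF reads off c3 = 2*c1 + 1*c2
```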

Matrix Multiplication

Column Perspective (what we have mostly been looking at until now)

Let's consider matrix multiplication of the form $Ax$, where $A$ is a matrix of some dimension and $x$ is a vector. The multiplication basically gives us a vector in the column space of $A$: specifically, the linear combination of the columns of $A$ whose coefficients are given by $x$. $Ax$ can be seen as $x$ doing something to the matrix $A$ to get some output vector; $x$ decides the proportions in which the columns of $A$ are combined to get the new vector.

Now let's talk about multiplication of the form $AX$, where both $A$ and $X$ are matrices. You can see this as many products $Ax_i$, where each $x_i$ is a column of the matrix $X$. Hence all the columns of the resultant matrix are essentially combinations of the columns of matrix $A$, where the proportions/coefficients of the combination are decided by the columns of matrix $X$.
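
A small numpy sketch of this column perspective, with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
x = np.array([10., 100.])

# A @ x is the same as combining the columns of A with coefficients from x.
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]
print(np.allclose(A @ x, by_columns))  # True

# For A @ X, each column of the result is A applied to a column of X.
X = np.array([[1., 0.],
              [2., 5.]])
print(np.allclose((A @ X)[:, 0], A @ X[:, 0]))  # True for every column
```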

Row perspective

The resultant matrix can be seen row by row: each row is a linear combination of the rows of the right-hand matrix, where the coefficients are given by the corresponding row of the left-hand matrix.
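
And a matching sketch of the row perspective, again with made-up matrices:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[5., 6.],
              [7., 8.]])

# Row i of A @ B is a combination of the rows of B,
# with coefficients taken from row i of A.
row0 = A[0, 0] * B[0, :] + A[0, 1] * B[1, :]
print(np.allclose((A @ B)[0, :], row0))  # True
```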

Matrix Inverse

  • $A$ and $A^{-1}$ always commute: $AA^{-1} = A^{-1}A = I$. (Note that the converse does not hold in general: two matrices can commute without being each other's inverse, e.g., any two diagonal matrices; what makes them inverses is that their product is the identity.)

  • A matrix is invertible only if its columns are linearly independent, i.e., its rows are linearly independent. https://youtu.be/fNpPrRNq8DU?list=PLlXfTHzgMRUKXD88IdzS14F4NxAZudSmv

You can find the matrix inverse by using Gaussian elimination: row-reduce the augmented matrix $[A \mid I]$ until the left half becomes $I$; the right half is then $A^{-1}$ (the Gauss–Jordan method).
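
A minimal sketch of this Gauss–Jordan procedure in numpy (a toy implementation for illustration, not a production routine; the helper name inverse_gauss_jordan is my own):

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # augmented matrix [A | I]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0):
            raise ValueError("matrix is singular (columns are linearly dependent)")
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]        # scale the pivot row so the pivot is 1
        for row in range(n):         # eliminate this column in every other row
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                  # the right half is now A^-1

A = np.array([[2., 1.], [5., 3.]])
A_inv = inverse_gauss_jordan(A)
# A and A^-1 commute: both products give the identity.
print(np.allclose(A @ A_inv, np.eye(2)), np.allclose(A_inv @ A, np.eye(2)))
```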

Transpose and Symmetric Matrices

Columns become rows when transposed. When transposing, it can become useful to use the row perspective.

  • Can be used to define symmetric matrices, i.e., $A = A^T$.

  • $AA^T$ is always a symmetric matrix. This is easy to see: $(AA^T)^T = (A^T)^T A^T = AA^T$, since transposing just interchanges rows and columns (see the quick check below).
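
A quick numerical check with a random matrix:

```python
import numpy as np

A = np.random.rand(3, 5)    # A need not be square for A @ A.T to exist
S = A @ A.T
print(np.allclose(S, S.T))  # True: (A A^T)^T = (A^T)^T A^T = A A^T
```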

x^TAy

Properties of $x^TAy$.
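
Two basic properties worth noting here: $x^TAy$ is a scalar (bilinear in $x$ and $y$), and since a scalar equals its own transpose, $x^TAy = y^TA^Tx$. A quick check with arbitrary values:

```python
import numpy as np

x = np.array([1., 2.])
A = np.array([[0., 1.],
              [3., 4.]])
y = np.array([5., 6.])

s = x @ A @ y                      # x^T A y is a scalar (a bilinear form in x and y)
print(s)
print(np.isclose(s, y @ A.T @ x))  # a scalar equals its own transpose: x^T A y = y^T A^T x
```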

A=LU Decomposition

You can see solving $A = LU$ as trying to find $L^{-1}$ such that applying it on the left-hand side of $A$, i.e., doing row operations on $A$, gets you the matrix $U$.

Similarly, you can see it as trying to find $U^{-1}$ such that applying it on the right-hand side of $A$, i.e., doing column operations on $A$, gets you the matrix $L$.
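
This is what library LU routines compute; a quick sketch with scipy (note that scipy's lu may also permute rows for numerical stability, returning $A = PLU$):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

P, L, U = lu(A)  # P is a permutation from partial pivoting: A = P @ L @ U
print(L)         # lower triangular with unit diagonal
print(U)         # upper triangular
print(np.allclose(A, P @ L @ U))  # True
```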

Determinant

The determinant is essentially a way to find out whether the columns of the matrix, i.e., the spanning vectors of the column space, are linearly dependent or not. If the determinant of the matrix is zero, it essentially means that the columns are linearly dependent.

It is also a way of telling whether a system of eq'ns will have a unique solution: a nonzero determinant means a unique solution, while a zero determinant means either infinitely many solutions or none at all.

There is a geometric analogue of the determinant as well: in 2D its absolute value gives the area of the parallelogram spanned by the columns of the matrix, and in 3D it gives the volume of the parallelepiped.
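
A small numpy sketch of both facts, with made-up matrices:

```python
import numpy as np

# Dependent columns (c2 = 2*c1) give determinant zero.
singular = np.array([[1., 2.],
                     [3., 6.]])
print(np.isclose(np.linalg.det(singular), 0))  # True

# |det| is the area of the parallelogram spanned by the columns.
A = np.array([[3., 1.],
              [0., 2.]])
print(abs(np.linalg.det(A)))  # 6.0: vectors (3, 0) and (1, 2), base 3, height 2
```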
