Vector Space and Matrices
The zero element has to be part of the subspace.
Any linear combination (multiplication by a scalar, addition of two elements) of elements belonging to the subspace also needs to belong to the subspace.
To span a complete $n$-dimensional space, we need at least $n$ independent vectors of that dimension. For example, to span all of $\mathbb{R}^n$, you need at least $n$ independent vectors $v_1, \dots, v_n \in \mathbb{R}^n$.
Let's say you have $K$ vectors, each in $\mathbb{R}^n$, such that $K > n$. But you can have at most $n$ vectors spanning the whole of $\mathbb{R}^n$ with no linearly dependent vectors among them. Hence when you have $K$ vectors, some of them have to be linearly dependent on the others; if the $K$ vectors span $\mathbb{R}^n$, there will be exactly $K - n$ independent linear dependence relations among the $K$ vectors.
Say you have three vectors $v_1, v_2, v_3$ such that $c_1 v_1 + c_2 v_2 + c_3 v_3 = 0$; this is a linear dependence. But the coefficients $(c_1, c_2, c_3)$ of that linear combination themselves form a vector. So you can treat each linear dependence equation as actually a vector.
The null space is the span formed by these coefficient vectors, i.e., the vectors whose entries are the coefficients of linear combinations that give zero. In other words, the null space of given $k$ vectors is basically the span of the vectors made up of the coefficients of the linear dependence equations among those $k$ vectors. One linear dependence equation gives one vector; hence if there are $m$ dependence equations among those $k$ vectors, the null space will be the span of those $m$ vectors.
Essentially, the null space of given $k$ vectors is the space of vectors that always give zero when those $k$ vectors are linearly combined using the entries of the vector as coefficients. That is, let $A$ be the matrix formed by stacking the $k$ vectors as the columns of $A$. Then the null space is the space such that any vector $x$ belonging to it gives $Ax = 0$; i.e., the null space is the space of all vectors $x$ such that $Ax = 0$, where $A$ is the matrix formed by the $k$ vectors. Note that if the $k$ vectors are linearly independent, then there is no non-trivial $x$ possible, and the null space is just the zero vector.
You can also see the null space as defining the relationships among the columns of the matrix: the null space describes all the relations among the columns, i.e., it captures exactly the linear dependences of the columns.
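For example, here is a quick numpy/scipy sketch of this (the matrix is a made-up example): `scipy.linalg.null_space` returns an orthonormal basis of $\{x : Ax = 0\}$, and with $k = 4$ columns in $\mathbb{R}^2$ spanning the plane we expect $k - n = 2$ independent dependences.

```python
import numpy as np
from scipy.linalg import null_space

# k = 4 vectors in R^2 stacked as columns of A; since they span R^2,
# we expect k - n = 2 independent dependence relations.
A = np.array([[1.0, 2.0, 3.0, 0.0],
              [0.0, 1.0, 1.0, 2.0]])

N = null_space(A)              # orthonormal basis of {x : Ax = 0}
print(N.shape[1])              # 2 -> two independent linear dependences
print(np.allclose(A @ N, 0.0)) # True: each basis vector gives Ax = 0
```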
A spanning set is a set of vectors that can express any other vector from the same vector space as a linear combination. There is some minimum number of vectors needed in the spanning set in order to express all vectors possible in the vector space. For example: if you have one 2D vector, you can only express the vectors that lie on the same line as that vector. But if you have two 2D vectors which are not collinear, then you can span all the vectors in the plane using their linear combinations. A spanning set with the minimum number of vectors (those vectors would have to be linearly independent) is called a basis.
The minimum number of vectors needed in the spanning set in order to express all the vectors in that vector space (of course, those vectors will have to be linearly independent) is called the dimension of that vector space.
A spanning set with the minimum number of vectors needed to express all other vectors in the space is called a basis. A basis can express every vector as a linear combination, and does so uniquely. The number of vectors in the basis is the dimension of that vector space.
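A small numpy check of that uniqueness (the basis and vector here are made up): finding the coordinates of $v$ in a basis $B$ just means solving $Bc = v$, and since the basis vectors are independent the solution is unique.

```python
import numpy as np

# Two non-collinear 2D vectors (the columns of B) form a basis of R^2.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
v = np.array([3.0, 4.0])

c = np.linalg.solve(B, v)     # the unique coordinates of v in this basis
print(c)                      # [1. 2.] -> v = 1*b1 + 2*b2
print(np.allclose(B @ c, v))  # True
```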
Null space: analyzing the null space of given vectors helps us know the dependence between the vectors, i.e., whether the vectors are linearly dependent or independent.
Column space: the column space is the span of the given vectors; it helps us analyse what the vectors can reach as linear combinations.
The null space helps tell whether the solutions of the equation $Ax = b$ are infinitely many or unique, and the column space helps tell whether there is a unique solution or none at all (i.e., whether $b$ lies in the column space).
One can easily see that $\dim(R) + \dim(N) = $ number of columns, where $R$ is the column space and $N$ is the null space (the rank–nullity theorem).
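A quick numerical check of rank–nullity (the matrix is a made-up rank-1 example), using numpy's `matrix_rank` for $\dim(R)$ and scipy's `null_space` for $\dim(N)$:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1, three columns

rank = np.linalg.matrix_rank(A)      # dim(R) = dimension of column space
nullity = null_space(A).shape[1]     # dim(N) = dimension of null space
print(rank + nullity == A.shape[1])  # True: 1 + 2 == 3
```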
The point of Gaussian elimination is to make the relations among the given vectors easier to parse, or more evident. It helps to know the span of the vectors, to see whether the vectors are linearly independent or dependent, to check whether some vector is indeed in the span of the given vectors, etc.
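As an illustration (a minimal sketch; the matrix is made up, and I'm using sympy's `Matrix.rref`, which performs Gauss–Jordan elimination), the reduced row echelon form makes the hidden dependence among the columns obvious:

```python
from sympy import Matrix

# The third column is col1 + col2; rref makes the dependence evident.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])

R, pivots = A.rref()  # reduced row echelon form and pivot column indices
print(pivots)         # (0, 1): the first two columns are independent
print(R)              # third column of R is [1, 1, 0]^T: col3 = col1 + col2
```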
Let's consider matrix multiplication of the form $Ax$, where $A$ is some $m \times n$ matrix and $x$ is an $n$-dimensional vector. The multiplication basically gives a vector which is in the column space of $A$, specifically a linear combination of the columns of $A$ where the coefficients of the combination are given by $x$. So $Ax$ can be seen as $x$ doing something to the matrix $A$ to get some output vector: $x$ decides the proportions in which the columns of $A$ are combined to get the new vector.
Now let's talk about multiplication of the form $AB$, where both $A$ and $B$ are matrices. You can see the matrix multiplication as many products $Ab_i$, where each $b_i$ is a column of matrix $B$. Hence all the columns of the resultant matrix are essentially some combination of the columns of matrix $A$, where the proportions/coefficients of the combination are decided by the columns of matrix $B$.
Symmetrically, each row of the resultant matrix can be seen as a linear combination of the rows of the right-hand matrix, where the coefficients are given by the corresponding row of the left-hand matrix. Both views are checked in the sketch below.
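Here is a small numpy sketch of both perspectives (the matrices are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Column view: column j of AB is a combination of A's columns,
# with coefficients taken from column j of B.
col0 = B[0, 0] * A[:, 0] + B[1, 0] * A[:, 1]
print(np.allclose((A @ B)[:, 0], col0))  # True

# Row view: row i of AB is a combination of B's rows,
# with coefficients taken from row i of A.
row0 = A[0, 0] * B[0, :] + A[0, 1] * B[1, :]
print(np.allclose((A @ B)[0, :], row0))  # True
```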
A matrix and its inverse always commute: $AA^{-1} = A^{-1}A = I$. (Note that the converse does not hold in general: commuting matrices need not be inverses of each other; for example, every matrix commutes with the identity and with its own powers.)
A matrix is invertible only if its columns are linearly independent (equivalently, its rows are linearly independent). https://youtu.be/fNpPrRNq8DU?list=PLlXfTHzgMRUKXD88IdzS14F4NxAZudSmv
You can find the matrix inverse by using Gaussian elimination: row-reduce the augmented matrix $[A \mid I]$ until the left half becomes $I$; the right half is then $A^{-1}$.
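A minimal sketch of this Gauss–Jordan procedure with sympy (the matrix is a made-up $2 \times 2$ example):

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 1]])

# Row-reduce the augmented matrix [A | I]; the right half becomes A^{-1}.
R, _ = A.row_join(eye(2)).rref()
A_inv = R[:, 2:]
print(A_inv)                # Matrix([[1, -1], [-1, 2]])
print(A * A_inv == eye(2))  # True
```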
Columns become rows when transposed. When working with transposes, it can be useful to switch to the row perspective.
The transpose can be used to describe symmetric matrices, i.e., matrices with $A^T = A$.
$A^T A$ is always a symmetric matrix. This can be easily seen: $(A^T A)^T = A^T (A^T)^T = A^T A$. Transposing just interchanges rows and columns, and hence the product comes out symmetric.
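A quick numpy check (the rectangular matrix is random, just for illustration):

```python
import numpy as np

# A rectangular matrix; A^T A is square and symmetric.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

G = A.T @ A
print(np.allclose(G, G.T))  # True: (A^T A)^T = A^T (A^T)^T = A^T A
```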
Properties of matrix multiplication:
You can see solving $XA = B$ as trying to find $X$ such that applying it on the left-hand side of $A$, i.e., doing row operations on $A$, will get you the matrix $B$.
Similarly, you can see solving $AX = B$ as trying to find $X$ such that applying it on the right-hand side of $A$, i.e., doing column operations on $A$, will get you the matrix $B$.
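A small numpy illustration of this left/right asymmetry, using an elementary swap matrix (the matrices are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Elementary matrix that swaps two rows (or two columns).
E = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(E @ A)  # left multiplication acts on rows: [[3, 4], [1, 2]]
print(A @ E)  # right multiplication acts on columns: [[2, 1], [4, 3]]
```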
The determinant is essentially a way to find out whether the columns of the matrix, i.e., the spanning vectors of the column space, are linearly dependent or not. If the determinant of the matrix is zero, it means that the columns are linearly dependent.
It is also a way of telling whether a system of equations will have infinitely many solutions or a unique one: if it has infinitely many solutions, that means the determinant is zero.
A matrix with determinant zero is called a singular matrix.
The determinant of an upper or lower triangular matrix is given by the product of the diagonal entries of the matrix.
There is a geometric analogue of the determinant as well: in 2D, its absolute value gives the area of the parallelogram spanned by the columns of the matrix, and in 3D it gives the volume of the parallelepiped.
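A numpy sketch of the 2D case (the column vectors are made up; the matrix also happens to be upper triangular, so the determinant is the product of its diagonal entries):

```python
import numpy as np

# Columns u = (2, 0) and v = (1, 3) span a parallelogram.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Upper triangular, so det = product of the diagonal entries = 6.
area = abs(np.linalg.det(A))
print(area)  # 6.0
# Cross-check with the 2D cross product u_x*v_y - u_y*v_x:
print(A[0, 0] * A[1, 1] - A[1, 0] * A[0, 1])  # 6.0
```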