Engineering Analysis/Linear Independence and Basis


Linear Independence

A set of vectors $V = \{v_1, v_2, \dots, v_n\}$ is said to be linearly dependent if any vector in the set can be constructed from a linear combination of the other vectors in the set. Consider the following linear equation:

$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0$

The set of vectors $V$ is linearly independent if and only if this equation is satisfied only when all of the $a_i$ coefficients are zero. If we combine the $v$ vectors together into a single row vector:

$\hat{V} = [v_1\ v_2\ \cdots\ v_n]$

And we combine all the a coefficients into a single column vector:

$\hat{a} = [a_1\ a_2\ \cdots\ a_n]^T$

We have the following linear equation:

$\hat{V}\hat{a} = 0$

To show that this equation can only be satisfied by $\hat{a} = 0$, the matrix $\hat{V}$ must be invertible:

$\hat{V}^{-1}\hat{V}\hat{a} = \hat{V}^{-1} 0$
$\hat{a} = 0$

Remember that for the matrix to be invertible, its determinant must be non-zero.
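
As a small numerical sketch of this test (Python with NumPy; the vectors below are arbitrary example values, not taken from the text), we can stack the candidate vectors as the columns of $\hat{V}$ and check whether its determinant is non-zero:

 import numpy as np

 # Candidate vectors v1, v2, v3 as the columns of V (example values).
 V = np.array([[1.0, 0.0, 2.0],
               [0.0, 1.0, 1.0],
               [1.0, 1.0, 0.0]])

 # A non-zero determinant means V is invertible, so V a = 0 forces a = 0
 # and the columns are linearly independent.
 print(np.linalg.det(V))   # approximately -3, so the vectors are independent

In floating-point arithmetic a determinant that is merely close to zero already signals near-dependence, so numerical code usually compares against a small tolerance rather than against exactly zero.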

Non-Square Matrix V

If the matrix $\hat{V}$ is not square, then its determinant cannot be taken, and therefore the matrix is not invertible. To solve this problem, we can premultiply by the transpose matrix:

$\hat{V}^T\hat{V}\hat{a} = 0$

And then the square matrix $\hat{V}^T\hat{V}$ must be invertible:

$(\hat{V}^T\hat{V})^{-1}\hat{V}^T\hat{V}\hat{a} = 0$
$\hat{a} = 0$
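
The same test can be sketched numerically for a tall, non-square $\hat{V}$ (again with arbitrary example values): form the square matrix $\hat{V}^T\hat{V}$ and check that it is invertible, which holds exactly when the columns of $\hat{V}$ are linearly independent.

 import numpy as np

 # Three 4-dimensional vectors as the columns of a tall matrix V (example values).
 V = np.array([[1.0, 0.0, 1.0],
               [0.0, 1.0, 1.0],
               [1.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])

 G = V.T @ V                    # the square matrix V^T V
 # V^T V is invertible exactly when the columns of V are linearly independent.
 print(np.linalg.det(G) > 0)    # True for this example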

Rank

The rank of a matrix is the largest number of linearly independent rows or columns in the matrix.

To determine the rank, the matrix is typically reduced to row-echelon form. In the reduced form, the number of non-zero rows (equivalently, the number of pivot columns) is the rank of the matrix.

If we multiply two matrices A and B, and the result is C:

$AB = C$

Then the rank of C can be no larger than the smaller of the ranks of A and B:

$\operatorname{Rank}(C) \le \min[\operatorname{Rank}(A), \operatorname{Rank}(B)]$
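
Both the rank computation and the product bound can be checked numerically. The sketch below (Python with NumPy, example matrices) uses np.linalg.matrix_rank, which computes the rank from a singular value decomposition rather than by literal row reduction, but gives the same answer.

 import numpy as np

 A = np.array([[1.0, 2.0],
               [2.0, 4.0],
               [0.0, 1.0]])        # rank 2
 B = np.array([[1.0, 1.0, 0.0],
               [0.0, 0.0, 0.0]])   # rank 1

 C = A @ B
 print(np.linalg.matrix_rank(A))   # 2
 print(np.linalg.matrix_rank(B))   # 1
 # The rank of the product can never exceed the smaller of the two ranks.
 print(np.linalg.matrix_rank(C))   # 1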

Span

The span of a set of vectors $V$ is the set of all vectors that can be created by a linear combination of the vectors in the set.

Basis

A basis is a set of linearly-independent vectors that span the entire vector space.

Basis Expansion

If we have a vector $y \in V$, and $V$ has basis vectors $v_1, v_2, \dots, v_n$, then by definition we can write $y$ as a linear combination of the basis vectors:

$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = y$

or

$\hat{V}\hat{a} = y$

If $\hat{V}$ is invertible, the answer is simply $\hat{a} = \hat{V}^{-1} y$, but if $\hat{V}$ is not invertible, we can perform the following technique:

$\hat{V}^T\hat{V}\hat{a} = \hat{V}^T y$
$\hat{a} = (\hat{V}^T\hat{V})^{-1}\hat{V}^T y$

We call the quantity $(\hat{V}^T\hat{V})^{-1}\hat{V}^T$ the left-pseudoinverse of $\hat{V}$.
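
As a numerical sketch of this expansion (Python with NumPy; the basis vectors and coefficients are made-up examples), the left-pseudoinverse recovers the coefficients $\hat{a}$ from $y$:

 import numpy as np

 # Basis vectors as the columns of V (example values); y lies in their span.
 V = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 1.0]])
 y = V @ np.array([2.0, -1.0])     # so the true coefficients are [2, -1]

 # Left-pseudoinverse: (V^T V)^{-1} V^T
 a = np.linalg.inv(V.T @ V) @ V.T @ y
 print(a)                          # recovers approximately [ 2. -1.]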

Change of Basis

Frequently, it is useful to change the basis vectors to a different set of vectors that span the same space but have different properties. If we have a space $V$ with basis vectors $\hat{V}$, and a vector in $V$ called $x$, we can use the new basis vectors $\hat{W}$ to represent $x$:

$x = \sum_{i=1}^{n} a_i v_i = \sum_{j=1}^{n} b_j w_j$

or,

$x = \hat{V}\hat{a} = \hat{W}\hat{b}$

If $\hat{W}$ is invertible, then the solution to this problem is simply $\hat{b} = \hat{W}^{-1}\hat{V}\hat{a}$.
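
A minimal sketch of such a change of basis (Python with NumPy, with arbitrary example bases):

 import numpy as np

 # Old basis V and new basis W, basis vectors as columns (example values).
 V = np.array([[1.0, 0.0],
               [0.0, 1.0]])
 W = np.array([[1.0, 1.0],
               [0.0, 1.0]])

 a = np.array([3.0, 2.0])        # coordinates of x in the old basis
 x = V @ a

 # Coordinates of the same vector in the new basis: b = W^{-1} V a
 b = np.linalg.solve(W, x)
 print(b)                        # [1. 2.]
 print(np.allclose(W @ b, x))    # True: both expansions give the same x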

Gram-Schmidt Orthogonalization

If we have a set of basis vectors that are not orthogonal, we can use a process known as orthogonalization to produce a new set of basis vectors for the same space that are orthogonal:

Given: $\hat{V} = [v_1\ v_2\ \cdots\ v_n]$
Find the new basis $\hat{W} = [w_1\ w_2\ \cdots\ w_n]$
Such that $\langle w_i, w_j \rangle = 0$ for all $i \neq j$

We can define the vectors as follows:

  1. $w_1 = v_1$
  2. $w_m = v_m - \sum_{i=1}^{m-1} \frac{\langle v_m, w_i \rangle}{\langle w_i, w_i \rangle} w_i$

Notice that the vectors produced by this technique are orthogonal to each other, but they are not necessarily orthonormal. To make the w vectors orthonormal, you must divide each one by its norm:

$\bar{w} = \frac{w}{\|w\|}$
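
The whole procedure, including the final normalization, can be sketched in a few lines of Python with NumPy (the helper function gram_schmidt and the input vectors are illustrative, not part of the text above):

 import numpy as np

 def gram_schmidt(vectors):
     """Return an orthonormal basis built from linearly independent vectors."""
     basis = []
     for v in vectors:
         w = np.array(v, dtype=float)
         # Subtract the projection of v onto each previously computed w_i.
         for u in basis:
             w = w - (np.dot(v, u) / np.dot(u, u)) * u
         basis.append(w)
     # Normalize each orthogonal vector to unit length.
     return [w / np.linalg.norm(w) for w in basis]

 vs = [np.array([1.0, 1.0, 0.0]),
       np.array([1.0, 0.0, 1.0]),
       np.array([0.0, 1.0, 1.0])]
 ws = gram_schmidt(vs)
 # The pairwise inner products form (approximately) the identity matrix.
 print(np.round([[np.dot(a, b) for b in ws] for a in ws], 6))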

Reciprocal Basis

A reciprocal basis is a special type of basis that is related to the original basis. The reciprocal basis $\hat{W}$ can be defined as:

$\hat{W} = [\hat{V}^T]^{-1}$
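
A brief sketch in Python with NumPy (example basis) builds the reciprocal basis from this definition and verifies the consequence $\hat{W}^T\hat{V} = I$, which follows directly from $\hat{W} = [\hat{V}^T]^{-1}$:

 import numpy as np

 # Original basis vectors as the columns of V (example values).
 V = np.array([[2.0, 1.0],
               [0.0, 1.0]])

 # Reciprocal basis: W = (V^T)^{-1}
 W = np.linalg.inv(V.T)

 # Consequence of the definition: W^T V = V^{-1} V = I, so each reciprocal
 # basis vector has inner product 1 with its partner and 0 with the others.
 print(np.allclose(W.T @ V, np.eye(2)))   # True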
