Week 4 Notes


1) Linear Independence
Recall that we studied the set of column vectors of a matrix $A$. The matrix

$$A = \begin{bmatrix} 1 & 1 & 1 & 1 & 1 & 1 \\ 2 & 2 & 2 & 2 & 2 & 2 \\ 3 & 3 & 3 & 3 & 3 & 3 \end{bmatrix}$$

has column vectors that do not span $\mathbb{R}^3$. The reason it does not span is that there are too many redundant column vectors. We call such column vectors linearly dependent.

Definition:

A set of vectors $\{v_1, \dots, v_p\}$ is called linearly dependent (l.d.) if there are numbers $c_1, \dots, c_p$, not all zero, such that

$$c_1 v_1 + \cdots + c_p v_p = 0,$$

i.e. the vector equation has a non-zero solution.

Why is it called dependent? Suppose $\{v_1, v_2, v_3\}$ is l.d., i.e. there are some numbers $c_1, c_2, c_3$, not all zero, such that

$$c_1 v_1 + c_2 v_2 + c_3 v_3 = 0.$$

1) If $c_1 \neq 0$, we have $v_1 = -\dfrac{1}{c_1}(c_2 v_2 + c_3 v_3)$

2) If $c_2 \neq 0$, we have $v_2 = -\dfrac{1}{c_2}(c_1 v_1 + c_3 v_3)$

3) If $c_3 \neq 0$, we have $v_3 = -\dfrac{1}{c_3}(c_1 v_1 + c_2 v_2)$

In other words, in the last case it says that by moving in the directions $v_1, v_2$ only, we can reach the point $v_3$:

On the picture, the light blue plane is $\mathrm{span}\{u, v\}$. The vector $w$ lies on the plane, so it can be reached by moving in the directions of $u, v$. So $\{u, v, w\}$ are l.d.

Here is an example of a vector $w$ that cannot be reached by moving along the directions of $u, v$: the vector $w$ does not lie on $\mathrm{span}\{u, v\}$. So $\{u, v, w\}$ are NOT l.d. As one would expect, they are called linearly independent.

Definition:

A set of vectors $\{v_1, \dots, v_p\}$ is called linearly independent (l.i.) if the vector equation

$$c_1 v_1 + \cdots + c_p v_p = 0$$

ONLY has the unique (trivial) solution $c_1 = c_2 = \cdots = c_p = 0$.

Exercises:

Verify the following statements.

1) $\left\{ \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}1\\2\\3\end{bmatrix} \right\}$ is l.d. More generally, any set containing two equal vectors is l.d.

2) $\left\{ \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}0\\0\\0\end{bmatrix}, \begin{bmatrix}2\\3\\5\end{bmatrix} \right\}$ is l.d. More generally, any set containing the zero vector is l.d.

3) $\left\{ \begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix} \right\}$ is l.i.

4) $\left\{ \begin{bmatrix}1\\1\\3\end{bmatrix}, \begin{bmatrix}1\\3\\1\end{bmatrix}, \begin{bmatrix}3\\1\\1\end{bmatrix} \right\}$ is l.i.

5) $S = \left\{ \begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}2\\3\\4\end{bmatrix}, \begin{bmatrix}-1\\6\\9\end{bmatrix}, \begin{bmatrix}1\\3\\9\end{bmatrix} \right\}$ is l.d. More generally, if the number of entries of each vector is less than the number of vectors in $S$, then $S$ is always l.d.

Solution:

We will only cover a couple of the statements above.

2) $0 \cdot \begin{bmatrix}1\\2\\3\end{bmatrix} + 10 \cdot \begin{bmatrix}0\\0\\0\end{bmatrix} + 0 \cdot \begin{bmatrix}2\\3\\5\end{bmatrix} = 0$

Therefore the set is l.d. by the definition above (with $c_1 = 0$, $c_2 = 10$, $c_3 = 0$). Indeed, in terms of pictures, $0$ is always a redundant vector.

4) We want to check whether $c_1 \begin{bmatrix}1\\1\\3\end{bmatrix} + c_2 \begin{bmatrix}1\\3\\1\end{bmatrix} + c_3 \begin{bmatrix}3\\1\\1\end{bmatrix} = \begin{bmatrix}0\\0\\0\end{bmatrix}$ has non-trivial solutions for $c_1, c_2, c_3$ or not. So consider the coefficient matrix $\begin{bmatrix}1 & 1 & 3\\ 1 & 3 & 1\\ 3 & 1 & 1\end{bmatrix}$ and note that it has a pivot in every column – so the equation only has the trivial solution, from what we learnt about homogeneous equations.
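If you want to double-check the pivot count by computer, here is a minimal sketch using Python's sympy (not part of the course material; it simply row-reduces the coefficient matrix from part 4):

    from sympy import Matrix

    # Coefficient matrix whose columns are the three vectors from exercise 4)
    A = Matrix([[1, 1, 3],
                [1, 3, 1],
                [3, 1, 1]])

    rref_form, pivot_cols = A.rref()
    print(rref_form)       # the reduced row echelon form
    print(pivot_cols)      # (0, 1, 2): a pivot in every column

    # A pivot in every column means Ax = 0 has only the trivial solution,
    # i.e. the columns are linearly independent.
    print("linearly independent:", len(pivot_cols) == A.cols)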

Theorem:

Let $A$ be an $m \times n$ matrix. The following statements are equivalent, i.e. (a) ⇔ (b) ⇔ (c) ⇔ (d):

a) The equation $Ax = 0$ has only the trivial solution.

b) The only solution of $x_1 a_1 + \cdots + x_n a_n = 0$ in $\mathbb{R}^n$ is $x_1 = x_2 = \cdots = x_n = 0$.

c) The columns of $A$ are linearly independent.

d) $A$ has a pivot position in every column.

To conclude, we have the following table checking whether the column vectors of a matrix $A$ span $\mathbb{R}^m$ or are linearly independent (we will come to the last column of the table at the end of this week's notes).
Example:

Let $A = \begin{bmatrix} 1 & 4 & 7 & 10 \\ 2 & 5 & 8 & 11 \\ 3 & 6 & 9 & 12 \end{bmatrix}$.

i) Find the RREF of $A$.

ii) Do the columns of $A$ span $\mathbb{R}^3$? (Why $\mathbb{R}^3$? Does it make sense to ask the same question for $\mathbb{R}^4$, $\mathbb{R}^5$, …?)

iii) Are the columns of $A$ linearly independent?

Solution:

(i)

$$\begin{bmatrix} 1 & 4 & 7 & 10 \\ 2 & 5 & 8 & 11 \\ 3 & 6 & 9 & 12 \end{bmatrix} \to \begin{bmatrix} 1 & 4 & 7 & 10 \\ 0 & -3 & -6 & -9 \\ 0 & 0 & 0 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & 4 & 7 & 10 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & 0 & -1 & -2 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix}$$
(ii)

TO CHECK WHETHER THE COLUMNS OF $A$ SPAN $\mathbb{R}^3$, THE THEOREM IN THE NOTES SAYS WE ONLY NEED TO CHECK WHETHER EVERY ROW OF $A$ HAS A PIVOT.

According to the calculation above, the last row does not have a pivot. So the columns of $A$ do not span $\mathbb{R}^3$.

(iii)

TO CHECK WHETHER THE COLUMNS OF $A$ ARE LINEARLY INDEPENDENT, THE THEOREM IN THE NOTES SAYS WE ONLY NEED TO CHECK WHETHER

$$Ax = 0$$

HAS A NONTRIVIAL SOLUTION. (Recall that the equation above always has the trivial solution.)

In our case, it is the same as solving the system of equations with augmented matrix

$$\left[\begin{array}{cccc|c} 1 & 4 & 7 & 10 & 0 \\ 2 & 5 & 8 & 11 & 0 \\ 3 & 6 & 9 & 12 & 0 \end{array}\right] \to \left[\begin{array}{cccc|c} 1 & 0 & -1 & -2 & 0 \\ 0 & 1 & 2 & 3 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{array}\right]$$

Since there are two free columns (columns 3 and 4), there are nontrivial solutions. Hence the columns of $A$ are linearly dependent.
In conclusion:

1) The columns of $A$ do not span $\mathbb{R}^3$, i.e. if I move along the directions

$$\begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}4\\5\\6\end{bmatrix}, \begin{bmatrix}7\\8\\9\end{bmatrix}, \begin{bmatrix}10\\11\\12\end{bmatrix}$$

in $\mathbb{R}^3$, I cannot reach every single point in $\mathbb{R}^3$.

2) The columns of $A$ are linearly dependent: at least one of the 4 directions

$$\begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}4\\5\\6\end{bmatrix}, \begin{bmatrix}7\\8\\9\end{bmatrix}, \begin{bmatrix}10\\11\\12\end{bmatrix}$$

in $\mathbb{R}^3$ is 'redundant'. For example, if I move one unit along $\begin{bmatrix}1\\2\\3\end{bmatrix}$ and one unit along $\begin{bmatrix}7\\8\\9\end{bmatrix}$, I reach $\begin{bmatrix}8\\10\\12\end{bmatrix}$, which is the same as moving two units along $\begin{bmatrix}4\\5\\6\end{bmatrix}$.
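Both observations are easy to confirm numerically (a minimal Python/numpy sketch, included only as an illustration; the matrix is the $A$ above):

    import numpy as np

    A = np.array([[1, 4, 7, 10],
                  [2, 5, 8, 11],
                  [3, 6, 9, 12]])

    v1, v2, v3, v4 = A.T   # the four column vectors

    # One unit along v1 plus one unit along v3 equals two units along v2
    print(v1 + v3)         # [ 8 10 12]
    print(2 * v2)          # [ 8 10 12]

    # Equivalently, x = (1, -2, 1, 0) is a nontrivial solution of Ax = 0
    x = np.array([1, -2, 1, 0])
    print(A @ x)           # [0 0 0]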

Exercise:

1) Check the spanning and linear independence properties of the columns of the following matrices:

$$\begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}, \quad \begin{bmatrix} 1 & 0 & 3 \\ 2 & 0 & 4 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 7 & 3 \\ -3 & 6 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 & 2 \\ 7 & 3 & 7 \\ -3 & 6 & 3 \end{bmatrix}$$

2) Let $A$ be a $4 \times n$ matrix such that its columns span $\mathbb{R}^4$ and are linearly independent.

i) What is the value of $n$?

ii) What is the RREF of $A$?

Answers:

1) $\begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}$ and $\begin{bmatrix} 1 & 0 & 3 \\ 2 & 0 & 4 \end{bmatrix}$ span $\mathbb{R}^2$ but are not l.i.; $\begin{bmatrix} 0 & 1 \\ 7 & 3 \\ -3 & 6 \end{bmatrix}$ does not span $\mathbb{R}^3$ but is l.i.; $\begin{bmatrix} 0 & 1 & 2 \\ 7 & 3 & 7 \\ -3 & 6 & 3 \end{bmatrix}$ spans $\mathbb{R}^3$ and is l.i.

2) $n = 4$, and the RREF of $A$ is $\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$.
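A quick way to verify these answers by computer (a minimal numpy sketch, assuming the four matrices as printed above, and using the standard fact that the rank equals the number of pivots): the columns span $\mathbb{R}^m$ exactly when the rank equals the number of rows $m$, and they are l.i. exactly when the rank equals the number of columns.

    import numpy as np

    matrices = [
        np.array([[1, 3, 5], [2, 4, 6]]),
        np.array([[1, 0, 3], [2, 0, 4]]),
        np.array([[0, 1], [7, 3], [-3, 6]]),
        np.array([[0, 1, 2], [7, 3, 7], [-3, 6, 3]]),
    ]

    for M in matrices:
        r = np.linalg.matrix_rank(M)
        m, n = M.shape
        print(f"spans R^{m}: {r == m},  linearly independent: {r == n}")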
2) Matrix Transformations
Recall that we defined matrix-vector multiplication $Ax$, where

$A = [a_1 \ a_2 \ \dots \ a_n]$ — an $(m \times n)$ matrix

$x \in \mathbb{R}^n$ — an $(n \times 1)$ vector

$Ax \in \mathbb{R}^m$ — an $(m \times 1)$ vector

We offer another way to look at the multiplication - treating the process of multiplication as a function,

Input: $x \in \mathbb{R}^n$ ↦ Output: $Ax \in \mathbb{R}^m$

We call this function $T: \mathbb{R}^n \to \mathbb{R}^m$. Its

• domain is the set of all inputs, i.e. all n-vectors, $\mathbb{R}^n$

• codomain is $\mathbb{R}^m$, the set of all m-vectors

• range is the set of all outputs of $T$, i.e. the set

$$\{T(x) \mid x \in \mathbb{R}^n\} = \{Ax \mid x \in \mathbb{R}^n\} = \left\{ x_1 a_1 + \cdots + x_n a_n \ \middle| \ \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} \in \mathbb{R}^n \right\} = \mathrm{span}\{a_1, \dots, a_n\}$$

Warning: In our case, $T(x) = Ax$. It does not mean $T = A$. As an analogy, $f(x) = 2x$ does not mean $f = 2$.

Exercise: Let $A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & -1 & 5 \end{bmatrix}$.

(i) Describe the matrix transformation $T$.

(ii) What are those $x$ whose images are $0$?

(iii) Is $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$ in the range of $T$?

Solution:

i) $T: \mathbb{R}^3 \to \mathbb{R}^2$ is given by $T\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_1 + 2x_2 + 3x_3 \\ 2x_1 - x_2 + 5x_3 \end{bmatrix}$

ii) $x = \begin{bmatrix} 13 \\ 1 \\ -5 \end{bmatrix}$ has image zero under the map $T$. More generally, all vectors in $\mathrm{span}\left\{ \begin{bmatrix} 13 \\ 1 \\ -5 \end{bmatrix} \right\} = \left\{ \begin{bmatrix} 13a \\ a \\ -5a \end{bmatrix} \ \middle| \ a \in \mathbb{R} \right\}$ have image $0$.

iii) Yes, actually every element $\begin{bmatrix} a \\ b \end{bmatrix}$ is in the range of $T$ (i.e. $T$ is surjective/onto).
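Treating the multiplication as a function is also how one would code it (a minimal Python/numpy sketch, not part of the notes; the name T and the use of least squares in part (iii) are only illustrative choices):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [2, -1, 5]])

    def T(x):
        """The matrix transformation x |-> Ax from R^3 to R^2."""
        return A @ x

    # (ii) the vector (13, 1, -5) is sent to zero, and so is any multiple of it
    print(T(np.array([13, 1, -5])))        # [0 0]
    print(T(7 * np.array([13, 1, -5])))    # [0 0]

    # (iii) e.g. (1, -1) is in the range: find some x with Ax = (1, -1)
    x, *_ = np.linalg.lstsq(A, np.array([1, -1]), rcond=None)
    print(A @ x)                           # approximately [ 1. -1.]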
3) Linear Transformations
In fact, the function $T: \mathbb{R}^n \to \mathbb{R}^m$ defined above has some nice properties:

$$T(x + y) = T(x) + T(y)$$

$$T(c \cdot x) = c \cdot T(x)$$

Any function satisfying these two properties is called a linear transformation. And since the matrix transformations we defined in the last section satisfy the two conditions above (check one of the exercises from last week):

Matrix transformations are linear transformations

Exercises:

1) Check that if $T$ is a linear transformation, the following hold:

$$T(0) = 0, \qquad T(a_1 x_1 + a_2 x_2 + \cdots + a_k x_k) = a_1 T(x_1) + \cdots + a_k T(x_k)$$

Indeed,

$$T(0) = T(0 \cdot 0) = 0 \cdot T(0) = 0$$

and

$$T(a_1 x_1 + a_2 x_2 + \cdots + a_k x_k) = T\big((a_1 x_1 + \cdots + a_{k-1} x_{k-1}) + a_k x_k\big)$$
$$= T(a_1 x_1 + \cdots + a_{k-1} x_{k-1}) + T(a_k x_k)$$
$$= T(a_1 x_1 + \cdots + a_{k-1} x_{k-1}) + a_k T(x_k)$$
$$= \cdots = a_1 T(x_1) + \cdots + a_k T(x_k),$$

repeating the same argument on the remaining sum.

2) Check whether the following are linear transformations. If yes, show that the two conditions above hold; if not, give a counter-example and check how either of the conditions above fails.

$$T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} |x_1| \\ x_2 \end{bmatrix}, \qquad T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \\ 1 \end{bmatrix}, \qquad T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 3x_1 - x_2 \\ 2x_1 + x_2 \end{bmatrix},$$

$$T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} x_1 x_2 \\ x_2 \end{bmatrix}, \qquad T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \\ x_2 \end{bmatrix}, \qquad T\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$$

Answers: NO, NO, YES for the first line; NO, YES, YES for the second line.

For example, $T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \\ 1 \end{bmatrix}$ is not a linear transformation, since

$$T(0) = T\begin{bmatrix} 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \neq 0$$

$T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} |x_1| \\ x_2 \end{bmatrix}$ is not a linear transformation since

$$T\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad T\begin{bmatrix} -1 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad T\begin{bmatrix} 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \neq T\begin{bmatrix} 1 \\ 0 \end{bmatrix} + T\begin{bmatrix} -1 \\ 0 \end{bmatrix}$$
$T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \\ x_2 \end{bmatrix}$ is a linear transformation since

$$T\left( c \cdot \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = T\begin{bmatrix} c x_1 \\ c x_2 \end{bmatrix} = \begin{bmatrix} c x_1 \\ c x_2 \\ c x_2 \end{bmatrix} = c \begin{bmatrix} x_1 \\ x_2 \\ x_2 \end{bmatrix} = c \cdot T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$$

$$T\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} + \begin{bmatrix} y_1 \\ y_2 \end{bmatrix} \right) = T\begin{bmatrix} x_1 + y_1 \\ x_2 + y_2 \end{bmatrix} = \begin{bmatrix} x_1 + y_1 \\ x_2 + y_2 \\ x_2 + y_2 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \\ x_2 \end{bmatrix} + \begin{bmatrix} y_1 \\ y_2 \\ y_2 \end{bmatrix} = T\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} + T\begin{bmatrix} y_1 \\ y_2 \end{bmatrix}$$
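If you want to experiment, the two conditions can also be probed numerically on random inputs (a minimal Python/numpy sketch, not part of the notes; failing such a test proves a map is not linear, while passing it is only evidence, not a proof):

    import numpy as np

    rng = np.random.default_rng(0)

    def looks_linear(T, n, trials=100):
        """Test T(x + y) = T(x) + T(y) and T(c*x) = c*T(x) on random inputs."""
        for _ in range(trials):
            x, y = rng.normal(size=n), rng.normal(size=n)
            c = rng.normal()
            if not np.allclose(T(x + y), T(x) + T(y)):
                return False
            if not np.allclose(T(c * x), c * T(x)):
                return False
        return True

    print(looks_linear(lambda x: np.array([abs(x[0]), x[1]]), 2))               # False
    print(looks_linear(lambda x: np.array([3*x[0] - x[1], 2*x[0] + x[1]]), 2))  # True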

Why linear transformations? The short answer is: you know almost everything about a linear transformation from very little information.

Suppose I tell you a function $f: \mathbb{R} \to \mathbb{R}$ with $f(1) = 2$, $f(2) = 2$. You can guess what $f(x)$ is, e.g. $f(x) = 2$, $f(x) = 2\cos(2\pi x)$, …, but you cannot tell exactly what $f(x)$ is.

Things are different for linear transformations.

Example: Let $T: \mathbb{R}^2 \to \mathbb{R}^3$ satisfy

$$T\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \\ 3 \end{bmatrix}, \qquad T\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} -1 \\ 0 \\ 4 \end{bmatrix}$$

Then we know $T\begin{bmatrix} a \\ b \end{bmatrix}$ for any $\begin{bmatrix} a \\ b \end{bmatrix} \in \mathbb{R}^2$, for example:

$$T\begin{bmatrix} -5 \\ 4 \end{bmatrix} = T\left( \begin{bmatrix} -5 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 4 \end{bmatrix} \right) = T\begin{bmatrix} -5 \\ 0 \end{bmatrix} + T\begin{bmatrix} 0 \\ 4 \end{bmatrix} = -5\, T\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 4\, T\begin{bmatrix} 0 \\ 1 \end{bmatrix}$$
$$= -5 \begin{bmatrix} 2 \\ 1 \\ 3 \end{bmatrix} + 4 \begin{bmatrix} -1 \\ 0 \\ 4 \end{bmatrix} = \begin{bmatrix} -10 \\ -5 \\ -15 \end{bmatrix} + \begin{bmatrix} -4 \\ 0 \\ 16 \end{bmatrix} = \begin{bmatrix} -14 \\ -5 \\ 1 \end{bmatrix}$$
More generally, suppose $T: \mathbb{R}^n \to \mathbb{R}^m$ is a linear transformation and $\{v_1, \dots, v_k\}$ is a collection of vectors in $\mathbb{R}^n$. If we know the values of $T(v_1), \dots, T(v_k)$, then we basically know the value of $T(a_1 v_1 + \cdots + a_k v_k)$. Namely,

$$T(a_1 v_1 + \cdots + a_k v_k) = a_1 T(v_1) + \cdots + a_k T(v_k) = \big[\, T(v_1) \ \ T(v_2) \ \ \dots \ \ T(v_k) \,\big] \begin{bmatrix} a_1 \\ \vdots \\ a_k \end{bmatrix}$$
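This formula is easy to mirror in code (a minimal Python/numpy sketch, using the example $T$ above; the variable names are only for illustration):

    import numpy as np

    # Values of T on the two vectors from the example above,
    # stacked as the columns of the matrix [T(v1) T(v2)]
    T_values = np.column_stack([[2, 1, 3], [-1, 0, 4]])

    # T(a1*v1 + a2*v2) = a1*T(v1) + a2*T(v2) = [T(v1) T(v2)] @ [a1, a2]
    coeffs = np.array([-5, 4])
    print(T_values @ coeffs)   # [-14  -5   1]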

4) Relation of Matrix Transformations and Linear Transformations


We have seen that all matrix transformations are linear transformations. So one would ask about the converse:

Question: Can linear transformations be realized as matrix transformations?

Answer: YES, as we show below.

Let $T: \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation. To understand what $T$ looks like, we only need to know what $T$ does to a finite collection of vectors. In particular, we make the following choice of this finite collection:

Let $e_1, e_2, \dots, e_n$ be the following vectors in $\mathbb{R}^n$:

$$e_1 = \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}, \quad e_2 = \begin{bmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{bmatrix}, \quad \dots, \quad e_n = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{bmatrix}$$

Then every $x = \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = x_1 e_1 + x_2 e_2 + \cdots + x_n e_n$. If we know the values of $T(e_1), \dots, T(e_n)$, then

$$Tx = x_1 T(e_1) + \cdots + x_n T(e_n) = \big[\, T(e_1) \ \ \dots \ \ T(e_n) \,\big] \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix}$$

Writing

$$A := \big[\, T(e_1) \ \ \dots \ \ T(e_n) \,\big]$$

as an $(m \times n)$-matrix, which is called the standard matrix of $T$, we see that

$$Tx = Ax$$

That means every linear transformation is a matrix transformation.
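In code, the standard matrix can be assembled exactly this way: apply $T$ to each $e_i$ and stack the results as columns (a minimal Python/numpy sketch, not part of the notes; the projection map $(x_1, x_2, x_3) \mapsto (x_1, x_2)$ is used only as an illustration):

    import numpy as np

    def standard_matrix(T, n):
        """Return the m x n matrix whose columns are T(e_1), ..., T(e_n)."""
        return np.column_stack([T(e) for e in np.eye(n)])

    # Example: the linear map (x1, x2, x3) |-> (x1, x2)
    T = lambda x: np.array([x[0], x[1]])
    A = standard_matrix(T, 3)
    print(A)                          # [[1. 0. 0.]
                                      #  [0. 1. 0.]]
    print(A @ np.array([5, 7, 9]))    # [5. 7.]  -- agrees with T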


Example:

Let $T: \mathbb{R}^2 \to \mathbb{R}^3$ be a linear transformation satisfying

$$T\begin{bmatrix} 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}, \qquad T\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix}$$

Find the standard matrix of $T$.

Solution:

We need to find $T\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $T\begin{bmatrix} 0 \\ 1 \end{bmatrix}$.

(i) To find $T\begin{bmatrix} 1 \\ 0 \end{bmatrix}$, note that

$$T\begin{bmatrix} 3 \\ 0 \end{bmatrix} = T\begin{bmatrix} 1 \\ -1 \end{bmatrix} + T\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} + \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix} = \begin{bmatrix} 3 \\ 4 \\ 3 \end{bmatrix}$$

Therefore

$$T\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \frac{1}{3}\, T\begin{bmatrix} 3 \\ 0 \end{bmatrix} = \frac{1}{3} \begin{bmatrix} 3 \\ 4 \\ 3 \end{bmatrix} = \begin{bmatrix} 1 \\ 4/3 \\ 1 \end{bmatrix}$$

(ii) To find $T\begin{bmatrix} 0 \\ 1 \end{bmatrix}$, note that

$$T\begin{bmatrix} -2 \\ 2 \end{bmatrix} = -2\, T\begin{bmatrix} 1 \\ -1 \end{bmatrix} = -2 \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = \begin{bmatrix} -4 \\ -2 \\ -4 \end{bmatrix}, \qquad T\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix}$$

So

$$T\begin{bmatrix} 0 \\ 3 \end{bmatrix} = T\begin{bmatrix} -2 \\ 2 \end{bmatrix} + T\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} -4 \\ -2 \\ -4 \end{bmatrix} + \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix} = \begin{bmatrix} -3 \\ 1 \\ -3 \end{bmatrix},$$

and

$$T\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \frac{1}{3} \begin{bmatrix} -3 \\ 1 \\ -3 \end{bmatrix} = \begin{bmatrix} -1 \\ 1/3 \\ -1 \end{bmatrix}$$

Hence the standard matrix is

$$A = \begin{bmatrix} 1 & -1 \\ 4/3 & 1/3 \\ 1 & -1 \end{bmatrix}$$

To check that the answer is correct, note that

$$T\begin{bmatrix} 1 \\ -1 \end{bmatrix} = A \begin{bmatrix} 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 1 & -1 \\ 4/3 & 1/3 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \end{bmatrix} = 1 \cdot \begin{bmatrix} 1 \\ 4/3 \\ 1 \end{bmatrix} - 1 \cdot \begin{bmatrix} -1 \\ 1/3 \\ -1 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}$$
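An aside, not in the notes: when $T$ is known on a basis $\{v_1, v_2\}$ rather than on $e_1, e_2$, the standard matrix can also be computed in one step. Since $A v_1 = T(v_1)$ and $A v_2 = T(v_2)$, we have $A [v_1\ v_2] = [T(v_1)\ T(v_2)]$, so $A = [T(v_1)\ T(v_2)]\,[v_1\ v_2]^{-1}$. A minimal numpy sketch for this example:

    import numpy as np

    V = np.column_stack([[1, -1], [2, 1]])         # v1, v2 as columns
    W = np.column_stack([[2, 1, 2], [1, 3, 1]])    # T(v1), T(v2) as columns

    # A [v1 v2] = [T(v1) T(v2)]  =>  A = [T(v1) T(v2)] [v1 v2]^(-1)
    A = W @ np.linalg.inv(V)
    print(A)                        # [[ 1.     -1.    ]
                                    #  [ 1.3333  0.3333]
                                    #  [ 1.     -1.    ]]
    print(A @ np.array([1, -1]))    # [2. 1. 2.], as required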
Example:

Find the standard matrix $A$ corresponding to the linear transformation $T: \mathbb{R}^3 \to \mathbb{R}^4$ with

$$T\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \\ 0 \end{bmatrix}, \qquad T\begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \end{bmatrix}, \qquad T\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}$$

Solution: We need to find $T\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$, $T\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$ and $T\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$.
(i) To find $T\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$, note that

$$\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = 1 \cdot \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} - 2 \cdot \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} + 1 \cdot \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$$

(if you do not know how I came up with the numbers 1, -2, 1 in the above formula, solve the vector equation $\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = a \cdot \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} + b \cdot \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} + c \cdot \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$ on your own.) Therefore,

$$T\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = 1 \cdot T\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} - 2 \cdot T\begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} + 1 \cdot T\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \\ 0 \end{bmatrix} - 2 \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \end{bmatrix} + \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ -2 \\ 1 \\ 0 \end{bmatrix}$$
(ii) To find $T\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$, note that $\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = 1 \cdot \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} - 2 \cdot \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$. Therefore,

$$T\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = 1 \cdot T\begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} - 2 \cdot T\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \end{bmatrix} - 2 \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} -2 \\ 1 \\ 0 \\ 0 \end{bmatrix}$$
(iii) To find $T\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$, it comes directly from the question that $T\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}$.

Hence the standard matrix is

$$A = \begin{bmatrix} 1 & -2 & 1 \\ -2 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$$
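The same one-step computation as in the previous example verifies this answer numerically (a minimal numpy sketch, not part of the notes; the three given input vectors form a basis of $\mathbb{R}^3$, so the matrix having them as columns is invertible):

    import numpy as np

    V = np.column_stack([[1, 2, 3], [0, 1, 2], [0, 0, 1]])            # given inputs
    W = np.column_stack([[0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]])   # their images

    A = W @ np.linalg.inv(V)
    print(np.round(A).astype(int))
    # [[ 1 -2  1]
    #  [-2  1  0]
    #  [ 1  0  0]
    #  [ 0  0  0]]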
Example:

Find the standard matrix of the linear transformation $R: \mathbb{R}^2 \to \mathbb{R}^2$ given by "rotation by $\theta$".

Solution:

We need to find $R\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $R\begin{bmatrix} 0 \\ 1 \end{bmatrix}$.

Check that

$$R\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} \cos\theta \\ \sin\theta \end{bmatrix}, \qquad R\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\!\big(\tfrac{\pi}{2} + \theta\big) \\ \sin\!\big(\tfrac{\pi}{2} + \theta\big) \end{bmatrix} = \begin{bmatrix} -\sin\theta \\ \cos\theta \end{bmatrix}$$

So the standard matrix is $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$.
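As a numerical sanity check (a minimal numpy sketch, not part of the notes): rotating $(1, 0)$ by $90°$ should give $(0, 1)$, and rotating $(0, 1)$ should give $(-1, 0)$.

    import numpy as np

    def rotation_matrix(theta):
        """Standard matrix of counterclockwise rotation by theta."""
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    R = rotation_matrix(np.pi / 2)
    print(np.round(R @ np.array([1, 0]), 6))   # [ 0.  1.]
    print(np.round(R @ np.array([0, 1]), 6))   # [-1.  0.]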
5) Injectivity and Surjectivity of Linear Transformations
Recall from Math 1013 that:

A function $f$ is one-to-one (injective) if $f(x) = f(y)$ implies $x = y$.

A function $f$ is onto (surjective) if the range of $f$ is the whole codomain.

What does it mean if $f$ is a linear transformation?

i) Injectivity of T:

Let $T$ be a linear transformation with standard matrix $A$. Suppose $T(u) = T(v) = b$. Then

$$T(u - v) = T(u) - T(v) = b - b = 0,$$

i.e. $A(u - v) = 0$.

Suppose the homogeneous matrix equation $Ax = 0$ only has the unique solution $x = 0$. Then $(u - v)$, as a solution of the equation $Ax = 0$, must be equal to $0$. Therefore $u = v$. This shows that

$T$ is injective ⇔ the matrix equation $Ax = 0$ has a unique solution

When does the matrix equation $Ax = 0$ have a unique solution?

Recall

$$A = \big[\, T(e_1) \ \dots \ T(e_n) \,\big], \qquad Ax = x_1 T(e_1) + \cdots + x_n T(e_n)$$

So

$T$ is injective ⇔ $Ax = 0$ has a unique solution ⇔ $\{T(e_1), \dots, T(e_n)\}$ is l.i.

ii) Surjectivity of T:

In order that $T$ is surjective, we need the range of $T$ to be the whole of $\mathbb{R}^m$. Recall from the first page that

Range of $T$ = $\mathrm{span}\{T(e_1), \dots, T(e_n)\}$

So

$T$ is surjective ⇔ $\{T(e_1), \dots, T(e_n)\}$ spans $\mathbb{R}^m$
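Putting this together with the pivot criteria from the start of these notes, both properties reduce to counting pivots, i.e. to a rank computation (a minimal numpy sketch, not part of the notes, assuming the standard fact that the rank equals the number of pivots: injective ⇔ pivot in every column ⇔ rank $= n$; surjective ⇔ pivot in every row ⇔ rank $= m$):

    import numpy as np

    def is_injective(A):
        """Columns of A linearly independent <=> rank equals number of columns."""
        return np.linalg.matrix_rank(A) == A.shape[1]

    def is_surjective(A):
        """Columns of A span R^m <=> rank equals number of rows."""
        return np.linalg.matrix_rank(A) == A.shape[0]

    A = np.array([[1, 4, 7, 10],
                  [2, 5, 8, 11],
                  [3, 6, 9, 12]])   # the matrix from the earlier example
    print(is_injective(A), is_surjective(A))   # False False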
