Week 4 Notes
1) Linear Independence
Recall that we studied the set of column vectors of a matrix. The matrix
$A = \begin{pmatrix} 1 & 1 & 1 & 1 & 1 & 1 \\ 2 & 2 & 2 & 2 & 2 & 2 \\ 3 & 3 & 3 & 3 & 3 & 3 \end{pmatrix}$
has column vectors not spanning $\mathbb{R}^3$. The reason why it does not span is that there are too many redundant column vectors. We call the column vectors linearly dependent.
Definition:
A set of vectors $\{v_1, \dots, v_p\}$ is called linearly dependent (l.d.) if there are numbers $c_1, \dots, c_p$, not all zero, such that
$c_1 v_1 + \cdots + c_p v_p = 0.$
For example, suppose $c_1 v_1 + c_2 v_2 + c_3 v_3 = 0$ with $c_1, c_2, c_3$ not all zero.
1) If $c_1 \neq 0$, we have $v_1 = -\dfrac{1}{c_1}(c_2 v_2 + c_3 v_3)$
2) If $c_2 \neq 0$, we have $v_2 = -\dfrac{1}{c_2}(c_1 v_1 + c_3 v_3)$
3) If $c_3 \neq 0$, we have $v_3 = -\dfrac{1}{c_3}(c_1 v_1 + c_2 v_2)$
In each case, one of the vectors can be reached by moving along the directions of the other two.
On the picture, the light blue plane is $\mathrm{span}\{u, v\}$. The vector $w$ lies on the plane, so it can be reached by moving in the directions of $u, v$. So $\{u, v, w\}$ are l.d.
Here is an example of a vector $w$ that cannot be reached by moving along the directions of $u, v$:
The vector $w$ does not lie on $\mathrm{span}\{u, v\}$. So $\{u, v, w\}$ are NOT l.d. As one would expect, they are called linearly independent.
Definition:
A set of vectors $\{v_1, \dots, v_p\}$ is called linearly independent (l.i.) if the only numbers $c_1, \dots, c_p$ satisfying
$c_1 v_1 + \cdots + c_p v_p = 0$
are $c_1 = \cdots = c_p = 0$.
Exercises:
1) $\left\{ (1,2,3)^T,\ (1,2,3)^T,\ (1,2,3)^T,\ (1,2,3)^T,\ (1,2,3)^T,\ (1,2,3)^T \right\}$ is l.d. More generally, any set with two equal vectors is l.d.
2) $\left\{ (1,2,3)^T,\ (0,0,0)^T,\ (2,3,5)^T \right\}$ is l.d. More generally, any set containing the zero vector is l.d.
3) $\left\{ (1,1)^T,\ (1,-1)^T \right\}$ is l.i.
4) $\left\{ (1,1,3)^T,\ (1,3,1)^T,\ (3,1,1)^T \right\}$ is l.i.
5) $S = \left\{ (1,2,3)^T,\ (2,3,4)^T,\ (-1,6,9)^T,\ (1,3,9)^T \right\}$ is l.d. More generally, if the number of entries of each vector is less than the number of vectors in $S$, then $S$ is always l.d.
Solution:
2) $0 \cdot (1,2,3)^T + 10 \cdot (0,0,0)^T + 0 \cdot (2,3,5)^T = 0$
Therefore it is l.d. by the definition above (with $c_1 = 0$, $c_2 = 10$, $c_3 = 0$). Indeed, in terms of pictures, the zero vector is always a redundant vector.
4) We want to check whether $x_1 (1,1,3)^T + x_2 (1,3,1)^T + x_3 (3,1,1)^T = (0,0,0)^T$ has non-trivial solutions for $x_1, x_2, x_3$ or not. So consider the coefficient matrix $\begin{pmatrix} 1 & 1 & 3 \\ 1 & 3 & 1 \\ 3 & 1 & 1 \end{pmatrix}$ and note that it has a pivot in every column – so the equation only has the trivial solution, from what we learnt about homogeneous equations. Hence the set is l.i.
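If you want to check such pivot counts by machine, the pivot count can be read off from the rank. A minimal Python sketch, assuming NumPy is available (the matrix is the one from Exercise 4):

import numpy as np

# Columns are linearly independent exactly when the rank equals the number
# of columns, i.e. there is a pivot in every column.
A = np.array([[1, 1, 3],
              [1, 3, 1],
              [3, 1, 1]])
rank = np.linalg.matrix_rank(A)
print(rank)                    # 3
print(rank == A.shape[1])      # True: the three columns are l.i.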
Theorem:
Let $A$ be an $m \times n$ matrix. The following statements are equivalent, i.e. (a) ⇔ (b) ⇔ (c) ⇔ (d).
To conclude, we have the following table checking whether the column vectors of a matrix span $\mathbb{R}^m$ or are linearly independent (we will come to the last column of the table at the end of this week's notes).
Example:
Let $A = \begin{pmatrix} 1 & 4 & 7 & 10 \\ 2 & 5 & 8 & 11 \\ 3 & 6 & 9 & 12 \end{pmatrix}$.
i) Find the RREF of $A$.
ii) Do the columns of $A$ span $\mathbb{R}^3$? (Why $\mathbb{R}^3$? Does it make sense to ask the same question for $\mathbb{R}^4$, $\mathbb{R}^5$, …?)
iii) Are the columns of $A$ linearly independent?
Solution:
(i)
$\begin{pmatrix} 1 & 4 & 7 & 10 \\ 2 & 5 & 8 & 11 \\ 3 & 6 & 9 & 12 \end{pmatrix} \to \begin{pmatrix} 1 & 4 & 7 & 10 \\ 0 & -3 & -6 & -9 \\ 0 & 0 & 0 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 4 & 7 & 10 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & -1 & -2 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix}$
(ii)
According to the calculations above, the last row does not have a pivot. So the columns of $A$ do not span $\mathbb{R}^3$.
(iii)
TO CHECK WHETHER THE COLUMNS OF $A$ ARE LINEARLY INDEPENDENT, THE THEOREM IN THE NOTES SAYS WE ONLY NEED TO CHECK WHETHER
$Ax = 0$
HAS A NONTRIVIAL SOLUTION. (Recall the equation above always has the trivial solution.)
In our case, it is the same as solving the system of equations with augmented matrix
$\left(\begin{array}{cccc|c} 1 & 4 & 7 & 10 & 0 \\ 2 & 5 & 8 & 11 & 0 \\ 3 & 6 & 9 & 12 & 0 \end{array}\right) \to \left(\begin{array}{cccc|c} 1 & 0 & -1 & -2 & 0 \\ 0 & 1 & 2 & 3 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{array}\right)$
Since the third and fourth columns have no pivots, there are non-trivial solutions, so the columns of $A$ are NOT linearly independent. For example, $(1,2,3)^T + (7,8,9)^T = (8,10,12)^T$, which is the same as moving two units along $(4,5,6)^T$.
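The same example can be checked numerically. A short sketch, assuming NumPy; it confirms both conclusions and the explicit dependence relation above:

import numpy as np

A = np.array([[1, 4, 7, 10],
              [2, 5, 8, 11],
              [3, 6, 9, 12]])
rank = np.linalg.matrix_rank(A)          # number of pivots in the RREF
print(rank)                              # 2
print(rank == 3)                         # False: the columns do not span R^3
print(rank == A.shape[1])                # False: the columns are not l.i.
# The dependence relation: column1 + column3 = 2 * column2
print(A[:, 0] + A[:, 2] - 2 * A[:, 1])   # [0 0 0]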
Exercise:
1) Check the spanning and linear independence properties of the columns of the following matrices:
$\begin{pmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{pmatrix}, \quad \begin{pmatrix} 1 & 0 & 3 \\ 2 & 0 & 4 \end{pmatrix}, \quad \begin{pmatrix} 0 & 1 \\ 7 & 3 \\ -3 & 6 \end{pmatrix}, \quad \begin{pmatrix} 0 & 1 & 2 \\ 7 & 3 & 7 \\ -3 & 6 & 3 \end{pmatrix}$
2) Let $A$ be a $4 \times n$ matrix such that its columns span $\mathbb{R}^4$ and are linearly independent.
i) What is the value of $n$?
ii) What is the RREF of $A$?
Answers:
1) $\begin{pmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{pmatrix}$ and $\begin{pmatrix} 1 & 0 & 3 \\ 2 & 0 & 4 \end{pmatrix}$ span, not l.i.; $\begin{pmatrix} 0 & 1 \\ 7 & 3 \\ -3 & 6 \end{pmatrix}$ does not span, l.i.; $\begin{pmatrix} 0 & 1 & 2 \\ 7 & 3 & 7 \\ -3 & 6 & 3 \end{pmatrix}$ spans and is l.i.
2) $n = 4$, RREF of $A = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$
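These answers can be double-checked with the same rank test. A small Python/NumPy sketch (the list and variable names are just illustrative):

import numpy as np

# "spans" <=> rank == number of rows; "l.i." <=> rank == number of columns.
matrices = [
    np.array([[1, 3, 5], [2, 4, 6]]),
    np.array([[1, 0, 3], [2, 0, 4]]),
    np.array([[0, 1], [7, 3], [-3, 6]]),
    np.array([[0, 1, 2], [7, 3, 7], [-3, 6, 3]]),
]
for M in matrices:
    r = np.linalg.matrix_rank(M)
    print(M.shape, "spans:", r == M.shape[0], "l.i.:", r == M.shape[1])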
2) Matrix Transformations
Recall that we defined the matrix-vector multiplication $Ax$, where
$x \in \mathbb{R}^n$ is an $(n \times 1)$ vector,
$Ax \in \mathbb{R}^m$ is an $(m \times 1)$ vector.
We offer another way to look at the multiplication: treat the process of multiplication as a function $T$, called the matrix transformation associated to $A$:
Input: $x \in \mathbb{R}^n \;\mapsto\;$ Output: $Ax \in \mathbb{R}^m$
• The range is the set of all outputs of $T$, i.e. the set
$\{T(x) \mid x \in \mathbb{R}^n\} = \{Ax \mid x \in \mathbb{R}^n\} = \left\{ x_1 a_1 + \cdots + x_n a_n \,\middle|\, \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix} \in \mathbb{R}^n \right\} = \mathrm{span}\{a_1, \dots, a_n\}$
where $a_1, \dots, a_n$ are the columns of $A$.
Warning: In our case, $T(x) = Ax$. It does not mean $T = A$. As an analogy, $f(x) = 2x$ and $f \neq 2$.
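Concretely, the matrix transformation is just the function sending $x$ to $Ax$. A minimal sketch in Python, assuming NumPy; the matrix is the $3 \times 6$ example from the first page and the input vector is an arbitrary test vector:

import numpy as np

A = np.array([[1, 1, 1, 1, 1, 1],
              [2, 2, 2, 2, 2, 2],
              [3, 3, 3, 3, 3, 3]])

def T(x):
    # the matrix transformation: input in R^6, output in R^3
    return A @ x

x = np.array([1, 0, 2, 0, 0, -1])
print(T(x))   # [2 4 6], a multiple of (1, 2, 3)^T: every output lies in the span of the columns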
Exercise: Let $A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & -1 & 5 \end{pmatrix}$.
(i) Write down the matrix transformation $T(x) = Ax$ explicitly (its domain, codomain and a formula).
(ii) Find a non-zero vector whose image under $T$ is the zero vector.
(iii) Is $\begin{pmatrix} 1 \\ -1 \end{pmatrix}$ in the range of $T$?
Solution:
i) $T: \mathbb{R}^3 \to \mathbb{R}^2$ is given by $T\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} x_1 + 2x_2 + 3x_3 \\ 2x_1 - x_2 + 5x_3 \end{pmatrix}$.
ii) $x = \begin{pmatrix} 13 \\ 1 \\ -5 \end{pmatrix}$ has image zero under the map $T$. More generally, all vectors in $\mathrm{span}\left\{\begin{pmatrix} 13 \\ 1 \\ -5 \end{pmatrix}\right\} = \left\{\begin{pmatrix} 13a \\ a \\ -5a \end{pmatrix} \,\middle|\, a \in \mathbb{R}\right\}$ have image $0$.
iii) Yes; actually every element $\begin{pmatrix} a \\ b \end{pmatrix}$ of $\mathbb{R}^2$ is in the range of $T$ (i.e. $T$ is surjective/onto).
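A quick numerical check of parts (ii) and (iii), assuming NumPy (np.linalg.lstsq is used only to produce some preimage of $(1,-1)^T$):

import numpy as np

A = np.array([[1, 2, 3],
              [2, -1, 5]])

def T(x):
    return A @ x

# (ii) this vector is sent to zero
print(T(np.array([13, 1, -5])))                      # [0 0]

# (iii) find some x with A x = (1, -1)^T, then verify it
x, *_ = np.linalg.lstsq(A, np.array([1, -1]), rcond=None)
print(np.allclose(T(x), [1, -1]))                    # True: (1, -1)^T is in the range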
3) Linear Transformations
In fact, the function $T: \mathbb{R}^n \to \mathbb{R}^m$ defined above has some nice properties:
i) $T(u + v) = T(u) + T(v)$ for all $u, v \in \mathbb{R}^n$;
ii) $T(cu) = c\,T(u)$ for all $u \in \mathbb{R}^n$ and all scalars $c$.
Any function satisfying these properties is called a linear transformation. And since the matrix transformations we defined in the last section satisfy the two conditions above (check one of the exercises last week), every matrix transformation is a linear transformation.
Exercises:
1) Show that if $T$ is a linear transformation, then $T(0) = 0$. Indeed, for any vector $v$,
$T(0) = T(0 \cdot v) = 0 \cdot T(v) = 0.$
2) Check whether the following are linear transformations. If yes, show that the two conditions above hold; if not, give a counter-example and check how one of the conditions above fails.
$T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} |x_1| \\ x_2 \end{pmatrix}$, $\quad T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \\ 1 \end{pmatrix}$, $\quad T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 3x_1 - x_2 \\ 2x_1 + x_2 \end{pmatrix}$,
$T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_2 \\ x_1 \end{pmatrix}$, $\quad T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = x_1 x_2$, $\quad T\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$
For example,
$T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \\ 1 \end{pmatrix}$ is not a linear transformation, since
$T(0) = T\begin{pmatrix} 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \neq 0$
$T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} |x_1| \\ x_2 \end{pmatrix}$ is not a linear transformation, since
$T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$, $T\begin{pmatrix} -1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$, but $T\left(\begin{pmatrix} 1 \\ 0 \end{pmatrix} + \begin{pmatrix} -1 \\ 0 \end{pmatrix}\right) = T\begin{pmatrix} 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \neq T\begin{pmatrix} 1 \\ 0 \end{pmatrix} + T\begin{pmatrix} -1 \\ 0 \end{pmatrix}$
$T\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$ is a linear transformation, since
$T\left(c \cdot \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}\right) = T\begin{pmatrix} cx_1 \\ cx_2 \\ cx_3 \end{pmatrix} = \begin{pmatrix} cx_1 \\ cx_2 \end{pmatrix} = c \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = c \cdot T\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}$
$T\left(\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} + \begin{pmatrix} y_1 \\ y_2 \\ y_3 \end{pmatrix}\right) = T\begin{pmatrix} x_1 + y_1 \\ x_2 + y_2 \\ x_3 + y_3 \end{pmatrix} = \begin{pmatrix} x_1 + y_1 \\ x_2 + y_2 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} + \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = T\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} + T\begin{pmatrix} y_1 \\ y_2 \\ y_3 \end{pmatrix}$
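The two conditions can also be probed numerically on a few test vectors. A sketch assuming NumPy; T_abs is the absolute-value map above, T_proj the projection map, and the test vectors are chosen arbitrarily:

import numpy as np

def T_abs(x):    # T(x1, x2) = (|x1|, x2)
    return np.array([abs(x[0]), x[1]])

def T_proj(x):   # T(x1, x2, x3) = (x1, x2)
    return np.array([x[0], x[1]])

u, v = np.array([1.0, 0.0]), np.array([-1.0, 0.0])
print(T_abs(u + v), T_abs(u) + T_abs(v))                  # [0. 0.] vs [2. 0.]: additivity fails

u, v, c = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0]), 7.0
print(np.allclose(T_proj(u + v), T_proj(u) + T_proj(v)))  # True
print(np.allclose(T_proj(c * u), c * T_proj(u)))          # True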
Why linear transformations? The short answer is: you know almost everything about a linear transformation from very little information.
Suppose I tell you a function $f: \mathbb{R} \to \mathbb{R}$ with $f(1) = 2$, $f(2) = 2$. You can guess what $f(x)$ is, e.g. $f(x) = 2$, $f(x) = 2\cos(2\pi x)$, …, but you can't tell exactly what $f(x)$ is.
Example: Let $T: \mathbb{R}^2 \to \mathbb{R}^3$ be a linear transformation satisfying
$T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix}, \quad T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \\ 4 \end{pmatrix}$
Then we know $T\begin{pmatrix} a \\ b \end{pmatrix}$ for any $\begin{pmatrix} a \\ b \end{pmatrix} \in \mathbb{R}^2$, for example:
$T\begin{pmatrix} -5 \\ 4 \end{pmatrix} = T\left(\begin{pmatrix} -5 \\ 0 \end{pmatrix} + \begin{pmatrix} 0 \\ 4 \end{pmatrix}\right) = T\begin{pmatrix} -5 \\ 0 \end{pmatrix} + T\begin{pmatrix} 0 \\ 4 \end{pmatrix} = -5\, T\begin{pmatrix} 1 \\ 0 \end{pmatrix} + 4\, T\begin{pmatrix} 0 \\ 1 \end{pmatrix}$
$= -5 \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix} + 4 \begin{pmatrix} -1 \\ 0 \\ 4 \end{pmatrix} = \begin{pmatrix} -10 \\ -5 \\ -15 \end{pmatrix} + \begin{pmatrix} -4 \\ 0 \\ 16 \end{pmatrix} = \begin{pmatrix} -14 \\ -5 \\ 1 \end{pmatrix}$
More generally, if $T: \mathbb{R}^n \to \mathbb{R}^m$ is a linear transformation and $\{v_1, \dots, v_k\}$ is a collection of vectors in $\mathbb{R}^n$, then if we know the values of $Tv_1, \dots, Tv_k$, we basically know the value of $T(a_1 v_1 + \cdots + a_k v_k)$. Namely,
$T(a_1 v_1 + \cdots + a_k v_k) = a_1 (Tv_1) + \cdots + a_k (Tv_k) = \begin{pmatrix} Tv_1 & Tv_2 & \cdots & Tv_k \end{pmatrix} \begin{pmatrix} a_1 \\ \vdots \\ a_k \end{pmatrix}$
4) The standard matrix of a linear transformation
Let $T: \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation. To understand what $T$ looks like, we only need to know what $T$ does to a finite collection of vectors. In particular, we make the following choice of this finite collection:
$e_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \quad e_2 = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \quad \dots, \quad e_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}$
Then every $x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = x_1 e_1 + x_2 e_2 + \cdots + x_n e_n$. If we know the values of $Te_1, \dots, Te_n$, then
$Tx = x_1\, Te_1 + \cdots + x_n\, Te_n = \begin{pmatrix} Te_1 & \cdots & Te_n \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}$
Writing
$A := \begin{pmatrix} Te_1 & \cdots & Te_n \end{pmatrix},$
we get $Tx = Ax$. The matrix $A$ is called the standard matrix of $T$.
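This recipe translates directly into code: apply $T$ to each standard basis vector and stack the results as columns. A sketch assuming NumPy; the helper name standard_matrix and the test matrix B are just for illustration:

import numpy as np

def standard_matrix(T, n):
    # columns are T(e_1), ..., T(e_n), where e_i are the standard basis vectors of R^n
    return np.column_stack([T(np.eye(n)[:, i]) for i in range(n)])

# Illustration: if T is itself a matrix transformation T(x) = B x,
# the construction recovers B.
B = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
T = lambda x: B @ x
print(standard_matrix(T, 3))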
Example: Let $T: \mathbb{R}^2 \to \mathbb{R}^3$ be a linear transformation satisfying
$T\begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 2 \\ 1 \\ 2 \end{pmatrix}, \quad T\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 3 \\ 1 \end{pmatrix}$
Find the standard matrix of $T$.
Solution:
We need to find $T\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $T\begin{pmatrix} 0 \\ 1 \end{pmatrix}$.
(i) To find $T\begin{pmatrix} 1 \\ 0 \end{pmatrix}$, note that
$T\begin{pmatrix} 3 \\ 0 \end{pmatrix} = T\begin{pmatrix} 1 \\ -1 \end{pmatrix} + T\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 2 \\ 1 \\ 2 \end{pmatrix} + \begin{pmatrix} 1 \\ 3 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 4 \\ 3 \end{pmatrix}$
Therefore
$T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \frac{1}{3}\, T\begin{pmatrix} 3 \\ 0 \end{pmatrix} = \frac{1}{3} \begin{pmatrix} 3 \\ 4 \\ 3 \end{pmatrix} = \begin{pmatrix} 1 \\ 4/3 \\ 1 \end{pmatrix}$
(ii) To find $T\begin{pmatrix} 0 \\ 1 \end{pmatrix}$, note that
$T\begin{pmatrix} -2 \\ 2 \end{pmatrix} = -2\, T\begin{pmatrix} 1 \\ -1 \end{pmatrix} = -2 \begin{pmatrix} 2 \\ 1 \\ 2 \end{pmatrix} = \begin{pmatrix} -4 \\ -2 \\ -4 \end{pmatrix}, \quad T\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 3 \\ 1 \end{pmatrix}$
So $T\begin{pmatrix} 0 \\ 3 \end{pmatrix} = T\begin{pmatrix} -2 \\ 2 \end{pmatrix} + T\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} -4 \\ -2 \\ -4 \end{pmatrix} + \begin{pmatrix} 1 \\ 3 \\ 1 \end{pmatrix} = \begin{pmatrix} -3 \\ 1 \\ -3 \end{pmatrix}$, and
$T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \frac{1}{3} \begin{pmatrix} -3 \\ 1 \\ -3 \end{pmatrix} = \begin{pmatrix} -1 \\ 1/3 \\ -1 \end{pmatrix}$
Hence the standard matrix is $A = \begin{pmatrix} 1 & -1 \\ 4/3 & 1/3 \\ 1 & -1 \end{pmatrix}$.
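As a sanity check (assuming NumPy), the computed standard matrix does reproduce the two given values of $T$:

import numpy as np

A = np.array([[1.0, -1.0],
              [4/3, 1/3],
              [1.0, -1.0]])
print(A @ np.array([1.0, -1.0]))   # [2. 1. 2.]  = T(1, -1)^T
print(A @ np.array([2.0, 1.0]))    # [1. 3. 1.]  = T(2, 1)^T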
Example: Let $T: \mathbb{R}^3 \to \mathbb{R}^2$ be a linear transformation satisfying
$T\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad T\begin{pmatrix} 0 \\ 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad T\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$
Find the standard matrix of $T$.
Solution: We need to find $Te_1$, $Te_2$ and $Te_3$.
(i) To find $Te_1$, note that
$\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = 1 \cdot \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} - 2 \cdot \begin{pmatrix} 0 \\ 1 \\ 2 \end{pmatrix} + 1 \cdot \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$
(if you do not know how I came up with the numbers 1, −2, 1 in the above formula, solve the vector equation $x_1 \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} + x_2 \begin{pmatrix} 0 \\ 1 \\ 2 \end{pmatrix} + x_3 \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$).
So $Te_1 = T\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} - 2\, T\begin{pmatrix} 0 \\ 1 \\ 2 \end{pmatrix} + T\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix} - 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \begin{pmatrix} 0 \\ 0 \end{pmatrix} = \begin{pmatrix} -2 \\ 1 \end{pmatrix}$.
In the same way, $Te_2 = T\begin{pmatrix} 0 \\ 1 \\ 2 \end{pmatrix} - 2\, T\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $Te_3 = T\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.
Hence the standard matrix is $A = \begin{pmatrix} -2 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}$.
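The coefficient hunting in this example (expressing each $e_i$ in terms of the given vectors) is exactly a linear solve, so the whole computation fits in a few lines. A sketch assuming NumPy, with V holding the given input vectors as columns and W their images (both names are just illustrative):

import numpy as np

V = np.array([[1, 0, 0],
              [2, 1, 0],
              [3, 2, 1]], dtype=float)   # columns: the three given input vectors
W = np.array([[0, 1, 0],
              [1, 0, 0]], dtype=float)   # columns: their images under T

# Solve V c_i = e_i for every i at once (C = V^{-1}); then T(e_i) = W c_i.
C = np.linalg.solve(V, np.eye(3))
A = W @ C                                # standard matrix [T(e1) T(e2) T(e3)]
print(A)                                 # [[-2.  1.  0.]
                                         #  [ 1.  0.  0.]]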
Example:
Find the standard matrix of the linear transformation $R: \mathbb{R}^2 \to \mathbb{R}^2$ given by "rotation by $\theta$".
Solution:
We need to find $R\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $R\begin{pmatrix} 0 \\ 1 \end{pmatrix}$.
Check that
$R\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix}, \quad R\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} \cos(\frac{\pi}{2} + \theta) \\ \sin(\frac{\pi}{2} + \theta) \end{pmatrix} = \begin{pmatrix} -\sin\theta \\ \cos\theta \end{pmatrix}$
So the standard matrix is $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$.
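A small sketch of the rotation matrix in Python, assuming NumPy (angles in radians; the helper name is just for illustration):

import numpy as np

def rotation_matrix(theta):
    # standard matrix of "rotation by theta" (counter-clockwise)
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

R = rotation_matrix(np.pi / 2)
print(np.round(R @ np.array([1.0, 0.0]), 10))   # [0. 1.]: e1 is rotated onto e2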
5) Injectiveness and surjectiveness of linear transformations
Recall in Math 1013 that a function is injective (one-to-one) if different inputs always give different outputs, and surjective (onto) if every element of the codomain is an output.
i) Injectivity of $T$:
Suppose $T(u) = T(v) = b$. Then
$T(u - v) = T(u) - T(v) = b - b = 0,$
i.e. $A(u - v) = 0$.
Suppose the homogeneous matrix equation $Ax = 0$ only has the unique solution $x = 0$. Then $u - v$, as a solution of the equation $Ax = 0$, must be equal to $0$. Therefore $u = v$. This shows that $T$ is injective.
Recall that $Ax = 0$ has only the trivial solution exactly when the columns of $A$ are linearly independent (there is a pivot in every column).
So $T$ is injective if and only if the columns of $A$ are linearly independent.
ii) Surjectivity of $T$:
In order that $T$ is surjective, we need the range of $T$ to be the whole of $\mathbb{R}^m$. Recall from the first page that the range of $T$ is $\mathrm{span}\{a_1, \dots, a_n\}$, the span of the columns of $A$.
So $T$ is surjective if and only if the columns of $A$ span $\mathbb{R}^m$.
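Both criteria reduce to a pivot count, so they can be tested with the rank as before. A closing sketch, assuming NumPy; the helper names are just illustrative:

import numpy as np

def injective(A):
    # T(x) = A x is injective iff A x = 0 has only the trivial solution,
    # i.e. the columns of A are linearly independent (pivot in every column)
    return np.linalg.matrix_rank(A) == A.shape[1]

def surjective(A):
    # T is surjective iff the columns of A span R^m (pivot in every row)
    return np.linalg.matrix_rank(A) == A.shape[0]

A = np.array([[1, 4, 7, 10],
              [2, 5, 8, 11],
              [3, 6, 9, 12]])
print(injective(A), surjective(A))   # False False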