# Linear Algebra II

## October 23, 2014

### Lecture 17

Filed under: 2014 Fall — Y.K. Lau @ 8:25 PM

We learnt how to represent a linear transformation ${T:V\rightarrow W}$ by a matrix. Remember that we need to fix an ordered basis ${B}$ for ${V}$ and an ordered basis ${D}$ for ${W}$ in order to represent ${T}$ by a matrix. Quite reasonably, the matrix representing ${T}$ depends on the choice of ordered bases ${B}$ and ${D}$. Then you may wonder, “What are the relations between the matrix representations for different pairs of ordered bases?” If you have this question, that’s good; please come to the next lecture, where we shall answer it.

There are many important results covered today:

• Thm 9.2.2 explains explicitly how to represent a linear transformation by a matrix. Its proof is not difficult. Remember the diagram helps a lot to read the result and to get ideas of the proof. Make sure you know how to read the diagram.

• Thm 9.2.3 is a natural and useful result, telling us how to find the matrix representation of the composite of two linear transformations.

• Thm 9.2.4 says that an isomorphism is characterized by the nonsingular property of its matrix representation. There are two interesting points underlying the result:
1. No matter which pair of bases you choose, the matrix representation of an isomorphism is always invertible/nonsingular.
2. If we find one pair of bases with respect to which the matrix representation of a linear transformation ${T}$ is nonsingular, then so are all other matrix representations of ${T}$. (Neither result is obvious just from the definition.)
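The recipe of Thm 9.2.2 can be sketched numerically. A minimal example (the map and bases below are my own choices, not from the lecture): take differentiation ${T: P_2\rightarrow P_1}$ with ordered bases ${B=[1,x,x^2]}$ and ${D=[1,x]}$; the ${j}$-th column of ${M_{DB}(T)}$ is the ${D}$-coordinate vector ${C_D(T(b_j))}$.

```python
import numpy as np

# A sketch of Thm 9.2.2 with a hypothetical example: T is differentiation
# P_2 -> P_1, with ordered bases B = [1, x, x^2] and D = [1, x].  The j-th
# column of M_DB(T) is C_D(T(b_j)), the D-coordinate vector of T(b_j).

def T(p):
    """Differentiate a0 + a1*x + a2*x^2, returning the coefficients [a1, 2*a2]."""
    a0, a1, a2 = p
    return [a1, 2 * a2]

# With these standard bases the coordinate maps are trivial, so the basis
# vectors of B have coordinate vectors e_1, e_2, e_3.
basis_B = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
M_DB = np.column_stack([T(b) for b in basis_B])
print(M_DB)            # [[0 1 0]
                       #  [0 0 2]]

# Check the diagram: M_DB(T) C_B(p) = C_D(T(p)) for p = 3 + 5x + 7x^2.
p = np.array([3, 5, 7])
print(M_DB @ p)        # [ 5 14], the coordinates of the derivative 5 + 14x
```

The assembled matrix satisfies ${M_{DB}(T)\,C_B(p) = C_D(T(p))}$, which is exactly what the commuting diagram expresses.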
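Thm 9.2.3 can likewise be checked numerically. In the sketch below (hypothetical maps and bases on ${{\mathbb R}^2}$), if the columns of an invertible matrix ${P}$ form an ordered basis, then ${C_P(x)=P^{-1}x}$, so ${M_{DB}(T)=D^{-1}A_T B}$ where ${A_T}$ is the standard matrix of ${T}$; the composite rule then follows by cancelling ${DD^{-1}}$ in the middle.

```python
import numpy as np

# A numerical check of Thm 9.2.3 (sketch; maps and bases are hypothetical).
# For maps between R^n, if the columns of an invertible matrix P form an
# ordered basis, then C_P(x) = P^{-1} x, so M_DB(T) = D^{-1} A_T B where
# A_T is the standard matrix of T.

A_T = np.array([[1., 2.], [0., 3.]])   # standard matrix of T : R^2 -> R^2
A_S = np.array([[2., 1.], [1., 1.]])   # standard matrix of S : R^2 -> R^2

B = np.array([[1., 1.], [0., 1.]])     # ordered basis of the domain of T
D = np.array([[2., 0.], [1., 1.]])     # ordered basis of the middle space
E = np.array([[1., 2.], [1., 3.]])     # ordered basis of the codomain of S

M_DB_T  = np.linalg.inv(D) @ A_T @ B           # M_DB(T)
M_ED_S  = np.linalg.inv(E) @ A_S @ D           # M_ED(S)
M_EB_ST = np.linalg.inv(E) @ (A_S @ A_T) @ B   # M_EB(S o T)

print(np.allclose(M_EB_ST, M_ED_S @ M_DB_T))   # True
```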
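Point 2 above can also be glimpsed numerically: any two matrix representations of the same ${T}$ differ only by invertible change-of-basis factors, so nonsingularity with respect to one pair of bases forces nonsingularity with respect to all. A sketch, with a hypothetical map and bases on ${{\mathbb R}^2}$:

```python
import numpy as np

# Sketch of point 2: two matrix representations of the same T differ only
# by invertible change-of-basis factors, so if one representation is
# nonsingular, all of them are.  (The map and bases are hypothetical.)

A_T = np.array([[1., 2.], [3., 5.]])   # standard matrix of an isomorphism

def rep(T_std, dom_basis, cod_basis):
    """Matrix of T from dom_basis to cod_basis (columns = basis vectors)."""
    return np.linalg.inv(cod_basis) @ T_std @ dom_basis

M1 = rep(A_T, np.eye(2), np.eye(2))                    # standard bases
M2 = rep(A_T,
         np.array([[1., 1.], [0., 1.]]),               # another basis pair
         np.array([[2., 1.], [1., 1.]]))

# Both representations have nonzero determinant.
print(np.linalg.det(M1), np.linalg.det(M2))
```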

We do not entirely follow the textbook in the proof of Thm 9.2.4. Our argument is somewhat lengthier but helps recap some important ideas. For reference, we repeat part of the argument below.

Let ${A=M_{DB}(T)}$ and suppose ${A}$ is invertible. Then ${A^{-1}\in M_{n,n}}$ exists, and ${A^{-1}}$ induces a linear transformation

${T_{A^{-1}}: {\mathbb R}^n\rightarrow {\mathbb R}^n}$, ${T_{A^{-1}} (x) = A^{-1} x}$   where   ${x\in {\mathbb R}^n}$.

Define   ${g: W\rightarrow V}$   by   ${g(w) = C_B^{-1} T_{A^{-1}} C_D (w)}$   for any   ${w\in W}$.

Our goal is to show that   ${g\circ T= 1_V}$   and   ${T\circ g = 1_W}$.

By Thm 9.2.3,   ${M_{BB} (gT)= M_{BD}(g) M_{DB}(T)}$.

By our definition of ${A}$, ${M_{DB}(T) = A}$.

Claim:   ${M_{BD}(g)= A^{-1}}$.

Proof:

Firstly, we have ${C_B(g(w))= M_{BD}(g) C_D(w)}$, ${\forall}$ ${w\in W}$.

Next, from the definition of ${g}$ (i.e. ${g(w) = C_B^{-1} T_{A^{-1}} C_D (w)}$), we get

${C_Bg=T_{A^{-1}}C_D.}$

Hence ${C_B(g(w)) = T_{A^{-1}}(C_D(w)) = A^{-1} C_D(w)}$.

Thus ${M_{BD}(g) C_D(w)= A^{-1}C_D(w)}$, ${\forall}$ ${w\in W}$.

Write ${D=[d_1,\cdots, d_n]}$.

Setting ${w=d_1}$, we get ${M_{BD}(g) e_1= M_{BD}(g) C_D(d_1)= A^{-1}C_D(d_1)= A^{-1}e_1}$.

i.e. the first columns of ${M_{BD}(g)}$ and ${A^{-1}}$ are identical.

Repeating the argument for ${d_2,\cdots, d_n}$, we get ${M_{BD}(g)=A^{-1}}$.
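The column-extraction step can be sketched numerically: since ${C_D(d_j)=e_j}$ and multiplying a matrix by ${e_j}$ picks out its ${j}$-th column, the equalities ${M e_j = A^{-1} e_j}$ for ${j=1,\cdots,n}$ force ${M=A^{-1}}$. A quick illustration with a hypothetical invertible ${A}$:

```python
import numpy as np

# Column-by-column reconstruction: C_D(d_j) = e_j, and multiplying a
# matrix by e_j extracts its j-th column, so agreeing on every e_j means
# agreeing as matrices.  (A is a hypothetical invertible matrix.)

A = np.array([[2., 1.], [1., 1.]])
A_inv = np.linalg.inv(A)

n = A.shape[0]
cols = [A_inv @ np.eye(n)[:, j] for j in range(n)]   # the vectors A^{-1} e_j
M = np.column_stack(cols)                            # rebuilt column by column

print(np.allclose(M, A_inv))   # True
```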

Thus ${M_{BB}(gT)= A^{-1}A= I}$ (the ${n\times n}$ identity matrix). Hence ${gT=1_V}$.

[To see why ${gT=1_V}$, you may argue as follows: For all ${v\in V}$,

$\displaystyle C_B(gT(v))= M_{BB}(gT) C_B(v) = I C_B(v)= C_B(v).$

As ${C_B}$ is an isomorphism, ${C_B(gT(v))= C_B(v)}$ implies ${gT(v)=v}$, which holds for all ${v\in V}$.]

Repeating the above argument with the analogous diagram (interchanging the roles of ${V}$, ${B}$ and ${W}$, ${D}$), we get ${Tg=1_W}$.
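The whole construction can be checked end to end in a small numerical sketch, taking ${V=W={\mathbb R}^2}$ with hypothetical ordered bases: build ${g = C_B^{-1}\, T_{A^{-1}}\, C_D}$ from ${A=M_{DB}(T)}$ and verify that both composites are identity maps.

```python
import numpy as np

# End-to-end sketch of the proof for V = W = R^2 (hypothetical bases):
# build g = C_B^{-1} o T_{A^{-1}} o C_D from A = M_DB(T) and check that
# both composites gT and Tg are identity maps.

T_std = np.array([[1., 2.], [3., 5.]])   # standard matrix of T (invertible)
Bm = np.array([[1., 1.], [0., 1.]])      # columns: ordered basis B of V
Dm = np.array([[2., 0.], [1., 1.]])      # columns: ordered basis D of W

C_B = np.linalg.inv(Bm)                  # coordinate map C_B(v) = Bm^{-1} v
C_D = np.linalg.inv(Dm)                  # coordinate map C_D(w) = Dm^{-1} w
A = C_D @ T_std @ Bm                     # A = M_DB(T)

# g(w) = C_B^{-1}( A^{-1} C_D(w) ), written as a standard matrix:
g_std = np.linalg.inv(C_B) @ np.linalg.inv(A) @ C_D

print(np.allclose(g_std @ T_std, np.eye(2)))   # gT = 1_V -> True
print(np.allclose(T_std @ g_std, np.eye(2)))   # Tg = 1_W -> True
```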

Remarks:
See Lect17.pdf for lecture slides.

After Class Exercises: Ex 9.1 Qn 1(b), 1(d), 2(b), 4(b), 4(d), 4(f), 5(b), 5(d), 7(b), 7(d). (See textbook’s solution.)

Ex 9.1 Qn 14, 15. (See L17-ace.pdf)

Revision: Ex 7.3 Qn 20, 21. (See L17-ace.pdf)