Unit LA4 has been the most challenging one yet for me. I’ve been working through it on and off for a few weeks now, and it’s been heavy going! Still, the topic itself is very interesting, much more so than the material on linear transformations in MS221.

I think the main reason for that is that MS221 concentrates on linear transformations of the plane, so I’m used to thinking of linear transformations as just being rotations, reflections and translations. Now, I love group theory and I love working with symmetries, but M208 has shown me that linear transformations are much more than that.

I mean, we’re all used to linear transformations that map points in \mathbb{R}^2 to other points in \mathbb{R}^2 – that’s basically what we got in MS221. And it’s fairly straightforward to imagine mapping points in \mathbb{R}^3, say, to points in \mathbb{R}^2 – like projecting the 2D shadow of a 3D object, or the 3D shadow of a 4D hypercube. But mapping a point in \mathbb{R}^n to a polynomial in P_n? That’s crazy!
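To make that feel a bit less crazy, here’s a toy example of my own (not one from the unit): take the map t:\mathbb{R}^3 \rightarrow P_3 that turns a point into the polynomial with those coordinates as coefficients,

t(a, b, c) = a + bx + cx^2.

Adding two points and then building the polynomial gives the same answer as building the two polynomials and then adding them, and likewise for scalar multiples – so t really is a linear transformation, just between two spaces that look nothing alike.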

And then there’s the wonderful Dimension Theorem:

\textup{Let } t:V \rightarrow W \textup{ be a linear transformation. Then}

\dim \textup{Im}(t) + \dim \ker (t) = \dim V.

Very simple to state, and very useful if you know two out of the three dimensions it refers to. Perhaps I’m easily amused, but I love the fact that if the image of t and the domain of t have the same dimension, then we know that the kernel of t must be the set containing only the zero vector. And if the kernel only contains the zero vector, then we also know that t is one-one! Awesome!
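Just to convince myself, I knocked up a quick sketch in Python (my own toy matrix, nothing from the TMA) that row-reduces the matrix of a transformation t:\mathbb{R}^4 \rightarrow \mathbb{R}^3 and checks the two dimensions against the theorem:

```python
from fractions import Fraction

def pivot_columns(rows):
    """Row-reduce a matrix (list of lists) over the rationals and
    return the indices of its pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = []
    r = 0
    for c in range(len(m[0])):
        # find a row at or below r with a nonzero entry in column c
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue  # no pivot in this column: it's a free column
        m[r], m[pr] = m[pr], m[r]
        m[r] = [x / m[r][c] for x in m[r]]          # scale pivot to 1
        for i in range(len(m)):
            if i != r and m[i][c] != 0:             # clear the column
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return pivots

# A made-up t: R^4 -> R^3; the third row is the sum of the first two,
# so the image can be at most 2-dimensional.
A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]

pivots = pivot_columns(A)
rank = len(pivots)                 # dim Im(t) = number of pivot columns
nullity = len(A[0]) - len(pivots)  # dim Ker(t) = number of free columns

print(rank, nullity, rank + nullity)  # → 2 2 4, and dim V = 4
```

The pivot columns of A span the image, and each free column contributes one basis vector to the kernel, which is exactly why the two dimensions have to add up to \dim V.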

Given how much trouble I had wrapping my head around the various results and strategies in this unit, I was expecting the associated TMA question to be quite gruelling, but it turned out to be fairly manageable. Not easy, but not as fiendish as they could have made it, I think. Or perhaps its fiendishness is subtle and will only reveal itself once I get my marked assignment back…

Anyway, this week I’ll be moving on to Unit LA5, the final linear algebra unit. It’s called Eigenvectors, but I’ve had a peek at the related TMA question, and that seems to be about quadrics – I wonder what the relationship between eigenvectors and classifying quadrics is. Hopefully it won’t take me another month to work through this unit and find out!
