Linear transformations and matrices

What makes a transformation "linear"?

  • Additivity: the transformation of a sum of vectors equals the sum of their individual transformations, $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$.

  • Scaling is preserved: $T(c\vec{v}) = c\,T(\vec{v})$.
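These two properties can be checked numerically. Below is a minimal sketch using a sample transformation (a 90-degree rotation, chosen purely for illustration; the matrix `A` and helper `T` are not from the notes):

```python
import numpy as np

# Sample linear transformation: a 90-degree counterclockwise rotation.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def T(v):
    """Apply the transformation to a vector."""
    return A @ v

u = np.array([3.0, 1.0])
v = np.array([-1.0, 2.0])
c = 2.5

# Additivity: T(u + v) == T(u) + T(v)
assert np.allclose(T(u + v), T(u) + T(v))

# Scaling preserved: T(c * v) == c * T(v)
assert np.allclose(T(c * v), c * T(v))
```

Any matrix-vector product satisfies both checks, which is exactly why matrices can represent linear transformations.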

Matrices

  • We use matrices to describe linear transformations numerically.

  • The effect of the transformation on all vectors can be summarized by what the transformation does to the basis vectors $\hat{i}, \hat{j}$.

  • By knowing what the transformation does to unit vectors, we can extend it to all other vectors.
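Concretely, the columns of the transformation matrix are exactly where $\hat{i}$ and $\hat{j}$ land. A small sketch (the matrix `A` here is a hypothetical example, not one given in the notes):

```python
import numpy as np

# Hypothetical transformation matrix, for illustration only.
A = np.array([[1.0, 3.0],
              [-2.0, 0.0]])

i_hat = np.array([1.0, 0.0])
j_hat = np.array([0.0, 1.0])

# The first column of A is the image of i-hat,
# the second column is the image of j-hat.
assert np.allclose(A @ i_hat, A[:, 0])
assert np.allclose(A @ j_hat, A[:, 1])
```

So recording the images of the two basis vectors is the same as writing down the matrix.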

Example

Suppose you are given data describing what happens to $\hat{i}$ and $\hat{j}$.

Extend this transformation to some vector $\vec{v} = \begin{pmatrix} -1 \\ 2 \end{pmatrix}$.

How can you then define a linear transformation generically?
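The generic rule: write $\vec{v}$ as a combination of the basis vectors, then combine their images with the same coefficients. A sketch of the example, assuming (hypothetically, since the notes do not give the data) that $\hat{i} \mapsto (1, -2)$ and $\hat{j} \mapsto (3, 0)$:

```python
import numpy as np

# Assumed images of the basis vectors (illustrative, not from the notes).
image_i = np.array([1.0, -2.0])
image_j = np.array([3.0, 0.0])

# The images form the columns of the transformation matrix.
A = np.column_stack([image_i, image_j])

v = np.array([-1.0, 2.0])

# Generic definition: A @ v = v[0] * (image of i-hat) + v[1] * (image of j-hat)
by_columns = v[0] * image_i + v[1] * image_j
assert np.allclose(A @ v, by_columns)

print(A @ v)  # → [5. 2.]
```

Here $\vec{v} = -1\,\hat{i} + 2\,\hat{j}$, so its image is $-1$ times the image of $\hat{i}$ plus $2$ times the image of $\hat{j}$; matrix-vector multiplication is just this bookkeeping.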

Matrix Multiplication

Order of transformations matters!

  1. Applying transformation B and then A is different from applying A and then B. Hence matrix multiplication is non-commutative.

  2. Applying transformation C, then B, and finally A: how the products are bracketed does not alter the order in which the transformations are applied. Therefore matrix multiplication is associative.
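Both properties can be demonstrated with small sample matrices (a rotation, a shear, and a scaling, chosen here for illustration):

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # 90-degree rotation
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # horizontal shear
C = np.array([[2.0, 0.0],
              [0.0, 0.5]])   # scaling

# Non-commutative: "B then A" (written A @ B) differs from "A then B" (B @ A).
assert not np.allclose(A @ B, B @ A)

# Associative: grouping does not change "C, then B, then A".
assert np.allclose((A @ B) @ C, A @ (B @ C))
```

Note that in the product A @ B, the rightmost matrix is applied first, matching the "B then A" reading in point 1.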
