*Notes to a video lecture on http://www.unizor.com*

__Matrices+ 01__

Matrix as Operator

In the beginning, matrices were just tables of real numbers with *m* rows and *n* columns; we called them *m*⨯*n* matrices.

We knew how to add two matrices with the same numbers of rows and columns, how to multiply a matrix by a scalar number and how to multiply an *m*⨯*n* matrix by an *n*⨯*k* matrix.

Here is a short recap.

*Addition of matrices*

Let **A**_{m⨯n} be an *m*⨯*n* matrix with elements *a*_{i,j}, where the first index signifies the row where the element is located and the second index signifies the column.

Let **B**_{m⨯n} be another *m*⨯*n* matrix with elements *b*_{i,j}.

Then **C=A+B** is an *m*⨯*n* matrix with elements *c*_{i,j}=a_{i,j}+b_{i,j}.

*Multiplication of a matrix by a scalar*

Let *s* be any real number (scalar) and **A**_{m⨯n} be an *m*⨯*n* matrix with elements *a*_{i,j}.

Multiplication of *s* by **A**_{m⨯n} produces a new matrix **B**_{m⨯n} with elements *b*_{i,j}=s·a_{i,j}.

Obviously, since the product of numbers is commutative and associative, the product of a matrix by a scalar is commutative, and the product of a matrix by two scalars is associative.

*Product of two matrices*

Let **A**_{m⨯n} be an *m*⨯*n* matrix with elements *a*_{i,j}.

Let **B**_{n⨯k} be an *n*⨯*k* matrix with elements *b*_{i,j}.

It's important for the definition of a product of matrices that the number of columns in the first matrix **A**_{m⨯n} equals the number of rows in the second matrix **B**_{n⨯k} (in our case this number is *n*).

The product **C**_{m⨯k} of **A**_{m⨯n} and **B**_{n⨯k} is an *m*⨯*k* matrix with elements *c*_{p,q} calculated as sums of products of *n* elements of the *p*^{th} row of matrix **A**_{m⨯n} by *n* elements of the *q*^{th} column of matrix **B**_{n⨯k}:

∀*p*∈[*1,m*], ∀*q*∈[*1,k*]:
*c*_{p,q} = Σ_{[1≤t≤n]} a_{p,t}·b_{t,q}

It's important to note that, generally speaking, multiplication of two matrices **IS NOT** commutative, see *Problem C* below.

The product of three matrices, however, is associative, see *Problem B* below.
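The three operations recapped above can be sketched in plain Python, with matrices represented as lists of lists (the function names `mat_add`, `scalar_mult` and `mat_mult` are illustrative, not part of the lecture):

```python
def mat_add(A, B):
    # c[i][j] = a[i][j] + b[i][j]; A and B must have the same dimensions
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def scalar_mult(s, A):
    # b[i][j] = s * a[i][j]
    return [[s * A[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def mat_mult(A, B):
    # c[p][q] = sum over t of a[p][t] * b[t][q];
    # the number of columns of A must equal the number of rows of B
    n = len(B)
    return [[sum(A[p][t] * B[t][q] for t in range(n))
             for q in range(len(B[0]))]
            for p in range(len(A))]
```

For example, `mat_mult([[1, 2], [3, 4]], [[5], [6]])` returns `[[17], [39]]`, since 1·5+2·6=17 and 3·5+4·6=39.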

For our purposes we will only consider "square" matrices with the same number *n* of rows and columns, and the *n*-dimensional Euclidean vector space of sequences of *n* real numbers organized in a row (a *row-vector*, which can be viewed as a *1*⨯*n* matrix) or in a column (a *column-vector*, which can be viewed as an *n*⨯*1* matrix).

What happens if we multiply an *n*⨯*n* matrix by a column-vector, which we can consider as an *n*⨯*1* matrix, according to the rules of multiplication of two matrices?

The multiplication is possible because the number of columns in the first matrix (*n*) equals the number of rows in the second one (the same *n*) and, according to the rules of multiplication, the resulting matrix has the number of rows of the first matrix (*n*) and the number of columns of the second one, that is *1*. So, the result is an *n*⨯*1* matrix, that is, a column-vector.

As we see, multiplication of our *n*⨯*n* matrix by a column-vector with *n* rows results in another column-vector with *n* rows.

In other words, multiplication by an *n*⨯*n* matrix **on the left** transforms one column-vector into another, that is, this multiplication represents an **operation** in the *n*-dimensional vector space of column-vectors. The *n*⨯*n* matrix itself, therefore, acts as an **operator** in the vector space of column-vectors of *n* components.
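This left-multiplication can be sketched as a tiny Python function; the matrix **A** and vector **u** below are arbitrary example values, not taken from the lecture:

```python
def apply_left(A, u):
    # (A·u)[p] = sum over t of a[p][t] * u[t]
    # A is an n⨯n list of lists; u is a column-vector given as a list of n numbers
    n = len(u)
    return [sum(A[p][t] * u[t] for t in range(n)) for p in range(n)]

# The operator A transforms one column-vector into another:
A = [[2, 0], [0, 3]]   # a hypothetical operator scaling the axes by 2 and 3
u = [1, 1]
v = apply_left(A, u)   # → [2, 3]
```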

Similar operations can be considered with row-vectors and their multiplication by matrices. The only difference is the order of this multiplication.

In the case of column-vectors we multiplied an *n*⨯*n* matrix by an *n*⨯*1* column-vector, getting another *n*⨯*1* column-vector.

In the case of row-vectors, we change the order and multiply a *1*⨯*n* row-vector by an *n*⨯*n* matrix **on the right**, getting another *1*⨯*n* row-vector.

Therefore, an *n*⨯*n* matrix can be considered an **operator** in a vector space of row-vectors, if we apply the multiplication by matrix from the right.
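The right-multiplication of a row-vector by a matrix can be sketched the same way (again with arbitrary example values):

```python
def apply_right(u, A):
    # (u·A)[q] = sum over t of u[t] * a[t][q]
    # u is a row-vector given as a list of n numbers; A is an n⨯n list of lists
    n = len(u)
    return [sum(u[t] * A[t][q] for t in range(n)) for q in range(n)]

A = [[1, 2], [3, 4]]   # a hypothetical 2⨯2 matrix
u = [1, 0]             # row-vector
v = apply_right(u, A)  # → [1, 2], the first row of A
```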

Let's examine the properties of the transformation of *n*⨯*1* column-vectors by multiplying them by *n*⨯*n* matrices.

*Problem A*

Prove that multiplication by an *n*⨯*n* matrix is a linear transformation in the *n*-dimensional Euclidean space ℝ^{n} of *n*⨯*1* column-vectors.

(a) ∀**u**∈ℝ^{n}, ∀*K*∈ℝ, ∀**A**_{n⨯n}:
**A·(K·u) = K·(A·u) = (K·A)·u**

(b) ∀**u,v**∈ℝ^{n}, ∀**A**_{n⨯n}:
**A·(u+v) = A·u + A·v**

(c) ∀**u**∈ℝ^{n}, ∀**A**_{n⨯n}, **B**_{n⨯n}:
**(A+B)·u = A·u + B·u**

*Hint A*

The proof is easily obtained straight from the definition of matrix multiplication.
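A numeric spot-check of the three properties (not a proof, and with hypothetical example values) can be written in a few lines of Python:

```python
def apply(A, u):
    # multiply n⨯n matrix A by column-vector u on the left
    return [sum(A[p][t] * u[t] for t in range(len(u))) for p in range(len(A))]

def add_vec(u, v):
    return [x + y for x, y in zip(u, v)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
u, v, K = [5, 6], [7, 8], 3

# (a) A·(K·u) = K·(A·u)
assert apply(A, [K * x for x in u]) == [K * x for x in apply(A, u)]
# (b) A·(u+v) = A·u + A·v
assert apply(A, add_vec(u, v)) == add_vec(apply(A, u), apply(A, v))
# (c) (A+B)·u = A·u + B·u
AplusB = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
assert apply(AplusB, u) == add_vec(apply(A, u), apply(B, u))
```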

*Problem B*

Prove that consecutive multiplication by two *n*⨯*n* matrices is associative in the *n*-dimensional Euclidean space ℝ^{n} of *n*⨯*1* column-vectors.

∀**u**∈ℝ^{n}, ∀**A**_{n⨯n}, **B**_{n⨯n}:
**A·(B·u) = (A·B)·u**

*Hint B*

It's all about changing the order of summation.

Let's demonstrate it for *n=2*.

Given two matrices **A** and **B** and a column-vector **u**.

Matrix **A**:
a_{1,1} | a_{1,2}
a_{2,1} | a_{2,2}

Matrix **B**:
b_{1,1} | b_{1,2}
b_{2,1} | b_{2,2}

Column-vector **u**:
u_{1}
u_{2}

Let **v=B·u** and **w=A·(B·u)=A·v**.

Then column-vector **v**'s components are
*v*_{1} = b_{1,1}·u_{1} + b_{1,2}·u_{2}
*v*_{2} = b_{2,1}·u_{1} + b_{2,2}·u_{2}

The components of vector **w** are
*w*_{1} = a_{1,1}·v_{1} + a_{1,2}·v_{2} =
= a_{1,1}·(b_{1,1}·u_{1} + b_{1,2}·u_{2}) +
+ a_{1,2}·(b_{2,1}·u_{1} + b_{2,2}·u_{2}) =
= a_{1,1}·b_{1,1}·u_{1} + a_{1,1}·b_{1,2}·u_{2} +
+ a_{1,2}·b_{2,1}·u_{1} + a_{1,2}·b_{2,2}·u_{2} =
= (a_{1,1}·b_{1,1} + a_{1,2}·b_{2,1})·u_{1} +
+ (a_{1,1}·b_{1,2} + a_{1,2}·b_{2,2})·u_{2}

*w*_{2} = a_{2,1}·v_{1} + a_{2,2}·v_{2} =
= a_{2,1}·(b_{1,1}·u_{1} + b_{1,2}·u_{2}) +
+ a_{2,2}·(b_{2,1}·u_{1} + b_{2,2}·u_{2}) =
= a_{2,1}·b_{1,1}·u_{1} + a_{2,1}·b_{1,2}·u_{2} +
+ a_{2,2}·b_{2,1}·u_{1} + a_{2,2}·b_{2,2}·u_{2} =
= (a_{2,1}·b_{1,1} + a_{2,2}·b_{2,1})·u_{1} +
+ (a_{2,1}·b_{1,2} + a_{2,2}·b_{2,2})·u_{2}

Let's calculate the product of the two matrices, **A·B**.

Matrix **A·B**:
a_{1,1}·b_{1,1}+a_{1,2}·b_{2,1} | a_{1,1}·b_{1,2}+a_{1,2}·b_{2,2}
a_{2,1}·b_{1,1}+a_{2,2}·b_{2,1} | a_{2,1}·b_{1,2}+a_{2,2}·b_{2,2}

Notice that the coefficients at *u*_{1} and *u*_{2} in the expression for *w*_{1} above are the same as the elements of the first row of matrix **(A·B)**.

Analogously, the coefficients at *u*_{1} and *u*_{2} in the expression for *w*_{2} above are the same as the elements of the second row of matrix **(A·B)**.

That means that, if we multiply matrix **(A·B)** by the column-vector **u**, we will get the same vector **w** as above.

That proves the associativity for the two-dimensional case.

*Problem C*

Prove that consecutive multiplication by two *n*⨯*n* matrices, generally speaking, **IS NOT** commutative.

That is, in general,
**(A·B)·u ≠ (B·A)·u**

*Proof C*

To prove it, it is sufficient to present a particular case when the product of two matrices is not commutative.

Consider matrix **A**:
1 | 2
3 | 4

Matrix **B**:
−1 | 2
−3 | 4

Then matrix **A·B** will be
−7 | 10
−15 | 22

The reverse multiplication **B·A** will be
5 | 6
9 | 10

As you see, matrices **A·B** and **B·A** are completely different, and, obviously, their products with most vectors will produce different results.

Take a vector with components (*1,0*), for example. Matrix **A·B** will transform it into (*−7,−15*), while matrix **B·A** will transform it into (*5,9*).

End of Proof.
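The counterexample above can be checked numerically with a short Python sketch:

```python
def mat_mult(A, B):
    # c[p][q] = sum over t of a[p][t] * b[t][q]
    return [[sum(A[p][t] * B[t][q] for t in range(len(B)))
             for q in range(len(B[0]))]
            for p in range(len(A))]

def apply(A, u):
    # multiply n⨯n matrix A by column-vector u on the left
    return [sum(A[p][t] * u[t] for t in range(len(u))) for p in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[-1, 2], [-3, 4]]

AB = mat_mult(A, B)   # → [[-7, 10], [-15, 22]]
BA = mat_mult(B, A)   # → [[5, 6], [9, 10]]
assert AB != BA       # the two products differ

u = [1, 0]
assert apply(AB, u) == [-7, -15]
assert apply(BA, u) == [5, 9]
```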
