Monday, September 16, 2024

Matrices+ 02 - Eigenvalues: UNIZOR.COM - Math+ & Problems - Matrices

Notes to a video lecture on http://www.unizor.com

Matrices+ 02
Matrix Eigenvalues


The concepts addressed in this lecture for the two-dimensional real case are equally applicable to N-dimensional spaces and even to real or complex abstract vector spaces with linear transformations defined on them.
Presentation in a two-dimensional real space is chosen for its relative simplicity and easy exemplification.

Let's consider a 2⨯2 matrix A as a linear operator in the two-dimensional Euclidean vector space. In other words, multiplication of any vector v on a coordinate plane by this 2⨯2 matrix A linearly transforms it into another vector on the plane w=A·v.
Assume, matrix A is
||5  6||
||6 10||

Let's see how this linear operator works, if applied to different vectors.

We will use a row-vector notation in the text for compactness, but column-vector notation in the transformation examples below.
We will enclose the coordinates of our vectors in double bars, like matrices, because a row-vector is a matrix with only one row, and a column-vector is a matrix with only one column.

Our first example of a vector to apply this linear transformation is v=||1,1||.
||5  6||   ||1||   ||11||
||6 10|| · ||1|| = ||16||
Obviously, the resulting vector w=||11,16|| and the original one v=||1,1|| are not collinear.

Applied to a different vector v=||3,−2||, we obtain a somewhat unexpected result
||5  6||   || 3||   || 3||
||6 10|| · ||−2|| = ||−2||
Interestingly, the resulting vector w=||3,−2|| and the original one are the same. So, this operator leaves this particular vector in place. In other words, it retains the direction of this vector and multiplies its magnitude by a factor of 1.

Finally, let's apply our operator to a vector v=||2,3||.
||5  6||   ||2||   ||28||
||6 10|| · ||3|| = ||42||
Notice, the resulting vector w=||28,42|| is the original one v=||2,3|| multiplied by 14. So, this operator transforms this particular vector to a collinear one, just longer in magnitude by a factor of 14.

As we see, for this particular matrix we found two vectors that, if transformed by this matrix as a linear operator, retain their direction while changing their magnitude by some factor.
These vectors are called eigenvectors. For each eigenvector there is a factor that characterizes the change in its magnitude when this matrix acts on it as an operator. This factor is called an eigenvalue. In the example above the eigenvalue was 1 for v=||3,−2|| and 14 for v=||2,3||.
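As a quick numeric check, the two transformations above can be reproduced directly. This is a sketch in plain Python; the helper function `apply` is an assumed name, not part of the lecture.

```python
# Sketch: verify the two eigenvectors of the example matrix
# A = ||5 6; 6 10|| by applying A to each of them.

A = [[5, 6], [6, 10]]

def apply(A, v):
    """Multiply a 2x2 matrix A by a column vector v = (v1, v2)."""
    return (A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1])

# v = (3, -2) is mapped to itself: eigenvalue 1
print(apply(A, (3, -2)))   # (3, -2)

# v = (2, 3) is mapped to 14 times itself: eigenvalue 14
print(apply(A, (2, 3)))    # (28, 42)
```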

There are some questions one might ask.
1. Are there always some particular vectors that retain the direction if transformed by some particular matrix?
2. If yes, how to find them and how to find the corresponding multiplication factors?
3. How many such vectors exist, if any?
4. How to find all the multiplication factors for a particular matrix transformation?

Let's analyze the linear transformation by a matrix that leaves the direction of a vector without change, just changes the magnitude by some factor λ.

Assume, we have a matrix A=||ai,j||, where i,j∈{1,2}, in our two-dimensional Euclidean space.
This matrix converts any vector v=||v1,v2|| into some other vector, but we are looking for such vector v that is converted by this matrix into a collinear one.
If matrix A transforms vector v to a collinear one with the magnitude of the original one multiplied by a factor λ, the following matrix equation must hold
A·v = λ·v
or in coordinate form
||a1,1 a1,2||   ||v1||       ||v1||
||a2,1 a2,2|| · ||v2|| = λ · ||v2||
which is equivalent to
(a1,1−λ)·v1 + a1,2·v2 = 0
a2,1·v1 + (a2,2−λ)·v2 = 0

This is a system of two linear equations with three unknowns λ, v1 and v2.

One trivial solution would be v1=0 and v2=0, in which case λ can take any value.
This is not a case worthy of analyzing.

If the matrix of coefficients of this system has a non-zero determinant, this trivial solution would be the only one.
Therefore, if we are looking for a non-trivial solution, the matrix's determinant must be zero, which gives a specific condition on the value of λ.

Therefore, a necessary condition for existence of other than null-vector v is
(a1,1−λ)·(a2,2−λ) − a1,2· a2,1 = 0
or
λ² − (a1,1+a2,2)·λ +
+ a1,1·a2,2−a1,2·a2,1 = 0


Since we are looking for real values of λ, we have to examine the discriminant D of this quadratic equation.
D = (a1,1+a2,2)²−4·(a1,1·a2,2−a1,2·a2,1) =
= (a1,1−a2,2)²+4·a1,2·a2,1

If D is negative, there are no real solutions for λ.
If D is zero, there is one real solution for λ.
If D is positive, there are two real solutions for λ.
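The three cases above can be sketched in code. This is a plain Python illustration; the function name `real_eigenvalues` is an assumption, not from the lecture.

```python
# Sketch: real roots of the characteristic equation
#   λ² − (a11+a22)·λ + (a11·a22 − a12·a21) = 0
import math

def real_eigenvalues(a11, a12, a21, a22):
    """Return the real eigenvalues (0, 1 or 2 of them) of ||a11 a12; a21 a22||."""
    trace = a11 + a22
    det = a11*a22 - a12*a21
    D = trace*trace - 4*det          # equals (a11−a22)² + 4·a12·a21
    if D < 0:
        return []                    # no real eigenvalues
    r = math.sqrt(D)
    return [(trace - r) / 2, (trace + r) / 2]   # the same value twice if D == 0

print(real_eigenvalues(5, 6, 6, 10))            # [1.0, 14.0]

# det(A − λI) vanishes at each eigenvalue:
for lam in real_eigenvalues(5, 6, 6, 10):
    print((5 - lam)*(10 - lam) - 6*6)           # 0.0 both times
```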

Consider now that we have determined λ and would like to find vectors transformed into collinear ones by matrix A with this exact factor of change in magnitude.

If some vector v=||v1,v2|| that is transformed into a collinear one with a factor λ exists, vector s·v, where s is any real non-zero number, would have exactly the same property because of the associativity and commutativity of multiplication by a scalar.
A·(s·v) = (A·s)·v = (s·A)·v =
= s·(A·v) = s·(λ·v) = λ·(s·v)


Therefore, we don't need to determine the exact values v1 and v2; we need only the direction of vector v=||v1,v2||, and this direction is determined by the ratio v1/v2 or v2/v1 (to cover the cases when one of them might be zero).

If v2≠0, the directions of a vector v and that of vector ||v1/v2,1|| are the same.
If v1≠0, the directions of a vector v and that of vector ||1,v2/v1|| are the same.

From this follows that, firstly, we can search for eigenvectors among those with v2≠0, restricting our search to vectors ||x,1|| where x=v1/v2.
Then we can search for eigenvectors among those with v1≠0, restricting our search to vectors ||1,x|| where x=v2/v1.
In both cases we will have to solve a system of two equations with two unknowns λ and x.

Searching for vectors ||x,1||
In this case the matrix equation that might deliver the required vector looks like this
||a1,1 a1,2||   ||x||       ||x||
||a2,1 a2,2|| · ||1|| = λ · ||1||
Performing the matrix by vector multiplication on the left side and scalar by vector on the right side and equating each component, we obtain a system of two equations with two unknowns - λ and x:
a1,1·x+a1,2·1 = λ·x
a2,1·x+a2,2·1 = λ·1

Express λ from the second equation and substitute it into the first one, obtaining a quadratic equation for x:
a1,1·x+a1,2 = (a2,1·x+a2,2)·x
or
a2,1·x² + (a2,2−a1,1)·x − a1,2 = 0
The two solutions x1,2 of this equation, assuming they are real, produce two vectors ||x1,1|| and ||x2,1||, each of which satisfies the condition of collinearity after the matrix transformation.
Generally speaking, the factor λ will be different for each such vector.
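The quadratic above can be solved mechanically. A minimal sketch follows (the helper name `directions_x_1` is hypothetical), including the degenerate case a2,1=0, when the equation becomes linear.

```python
# Sketch: eigenvector directions of the form ||x,1||, solving
#   a21·x² + (a22−a11)·x − a12 = 0
import math

def directions_x_1(a11, a12, a21, a22):
    """Real solutions x such that ||x,1|| is an eigenvector."""
    a, b, c = a21, a22 - a11, -a12
    if a == 0:                       # equation degenerates to a linear one
        return [] if b == 0 else [-c / b]
    D = b*b - 4*a*c
    if D < 0:
        return []
    r = math.sqrt(D)
    return [(-b - r) / (2*a), (-b + r) / (2*a)]

print(directions_x_1(5, 6, 6, 10))   # the two directions, −3/2 and 2/3
```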

Searching for vectors ||1,x||
In this case the matrix equation that might deliver the required vector looks like this
||a1,1 a1,2||   ||1||       ||1||
||a2,1 a2,2|| · ||x|| = λ · ||x||
Performing the matrix by vector multiplication on the left side and scalar by vector on the right side and equating each component, we obtain a system of two equations with two unknowns - λ and x:
a1,1·1+a1,2·x = λ·1
a2,1·1+a2,2·x = λ·x

Express λ from the first equation and substitute it into the second one, obtaining a quadratic equation for x:
a2,1·1+a2,2·x = (a1,1·1+a1,2·x)·x
or
a1,2·x² + (a1,1−a2,2)·x − a2,1 = 0
The two solutions x1,2 of this equation, assuming they are real, produce two vectors ||1,x1|| and ||1,x2||, each of which satisfies the condition of collinearity after the matrix transformation.
Generally speaking, the factor λ will be different for each such vector.
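The mirror case can be sketched the same way (the helper name `directions_1_x` is hypothetical, not from the lecture).

```python
# Sketch: eigenvector directions of the form ||1,x||, solving
#   a12·x² + (a11−a22)·x − a21 = 0
import math

def directions_1_x(a11, a12, a21, a22):
    """Real solutions x such that ||1,x|| is an eigenvector."""
    a, b, c = a12, a11 - a22, -a21
    if a == 0:                       # equation degenerates to a linear one
        return [] if b == 0 else [-c / b]
    D = b*b - 4*a*c
    if D < 0:
        return []
    r = math.sqrt(D)
    return [(-b - r) / (2*a), (-b + r) / (2*a)]

print(directions_1_x(5, 6, 6, 10))   # the two directions, −2/3 and 3/2
```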

Once again, let's emphasize important definitions.
Vectors transformed into collinear ones by a matrix of transformation are called eigenvectors or characteristic vectors for this matrix.
The factor λ corresponding to some eigenvector is called eigenvalue or characteristic value of the matrix and this eigenvector.

Let's determine eigenvectors and eigenvalues for a matrix A
||5  6||
||6 10||
used as an example above.

The quadratic equation to determine the multiplier λ for this matrix is
λ² − (a1,1+a2,2)·λ +
+ a1,1·a2,2−a1,2·a2,1 = 0

which amounts to
λ² − 15λ + 14 = 0
with solutions
λ1 = 1 and λ2 = 14

Let's find the eigenvectors of this matrix.
The quadratic equation for eigenvectors of type ||x,1|| is
6x² + (10−5)x − 6 = 0 or
6x² + 5x − 6 = 0
Solutions are
x1,2 = (1/12)·(−5±√(25+4·36)) =
= (1/12)·(−5±13)

Therefore,
x1 = 2/3
x2 = −3/2
Two eigenvectors are:
v1 = ||2/3,1|| which is collinear to vector ||2,3|| used in the example above and
v2 = ||−3/2,1|| which is collinear to vector ||3,−2|| used in the example above.

The matrix transformation of these eigenvectors are
||5  6||   ||2/3||   ||28/3||
||6 10|| · || 1 || = || 14 ||
But the resulting vector ||28/3,14|| equals 14·||2/3,1||, which means that eigenvector ||2/3,1|| has eigenvalue 14.
||5  6||   ||−3/2||   ||−3/2||
||6 10|| · ||  1 || = ||  1 ||
But the resulting vector ||−3/2,1|| equals the original eigenvector ||−3/2,1||, which means that eigenvector ||−3/2,1|| has eigenvalue 1.

Not surprisingly, both eigenvectors found above have eigenvalues already found (1 and 14).

The quadratic equation for eigenvectors of type ||1,x|| is
6x² + (5−10)x − 6 = 0 or
6x² − 5x − 6 = 0
Solutions are
x1,2 = (1/12)·(5±√(25+4·36)) =
= (1/12)·(5±13)

Therefore,
x1 = 3/2
x2 = −2/3
Two eigenvectors are:
v1 = ||1,3/2|| which is collinear to vector ||2,3|| used in the example above and
v2 = ||1,−2/3|| which is collinear to vector ||3,−2|| used in the example above.
So, we did not gain any new eigenvalues by searching for vectors of a form ||1,x||.

The above calculations showed that for a given matrix we have two eigenvectors, each with its own eigenvalue.

Based on these calculations, we can now answer the questions presented before.

Q1. Are there always some particular vectors that retain the direction if transformed by some particular matrix?
A1. Not always, but only if the quadratic equations for x
a2,1·x² + (a2,2−a1,1)·x − a1,2 = 0
and
a1,2·x² + (a1,1−a2,2)·x − a2,1 = 0
where ||ai,j|| (i,j∈{1,2}) is a matrix of transformation, have real solutions.

Q2. If yes, how to find them?
A2. Solve the quadratic equations above and, for each real solution x of the first equation, vector ||x,1|| is an eigenvector and, for each real solution x of the second equation, vector ||1,x|| is an eigenvector. Then apply the matrix of transformation to each eigenvector ||x,1|| or ||1,x|| and compare the result with this vector. It should be equal to some eigenvalue λ multiplied by this eigenvector.

Q3. How many such vectors exist, if any?
A3. As many as the quadratic equations above have real solutions, but no more than two.
Incidentally, in the three-dimensional case our equations will be polynomials of the 3rd degree, and the number of solutions will be restricted to three.
In the N-dimensional case this maximum number will be N.

Q4. How to find all the multiplication factors for a particular matrix transformation?
A4. Quadratic equation for eigenvalues
λ² − (a1,1+a2,2)·λ +
+ a1,1·a2,2−a1,2·a2,1 = 0

can have 0, 1 or 2 real solutions.
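All three outcomes of A4 can be illustrated with concrete matrices (the three example matrices below are my own, not from the lecture):

```python
# Sketch: count the real solutions of the characteristic quadratic
# using its discriminant D = (a11−a22)² + 4·a12·a21.

def count_real_eigenvalues(a11, a12, a21, a22):
    D = (a11 - a22)**2 + 4 * a12 * a21
    if D < 0:
        return 0
    return 1 if D == 0 else 2

print(count_real_eigenvalues(0, -1, 1, 0))   # 0  (a rotation by 90°)
print(count_real_eigenvalues(3, 0, 0, 3))    # 1  (a uniform scaling by 3)
print(count_real_eigenvalues(5, 6, 6, 10))   # 2  (the example matrix)
```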

The concept of eigenvectors and eigenvalues (characteristic vectors and characteristic values) can be extended to N-dimensional Euclidean vector spaces and even to abstract vector spaces, like, for example, the set of all real functions integrable on the segment [0,1].
The detailed analysis of these cases is, however, beyond the current course, which aims, primarily, to introduce advanced concepts.


Problem A
Research conditions when a diagonal matrix (only elements along the main diagonal are not zero) has eigenvalues.

Solution A
Matrix of transformation A=||ai,j|| has zeros for i≠j.
So, it looks like this
||a1,1  0 ||
|| 0  a2,2||

The equation for eigenvalues in this (a1,2=a2,1=0) case is
λ² − (a1,1+a2,2)·λ + a1,1·a2,2 = 0
with immediately obvious solutions
λ1=a1,1 and λ2=a2,2
So, the values along the main diagonal of a diagonal matrix are the eigenvalues of this matrix.

Determine the eigenvectors now among vectors ||x,1||.
Original quadratic equation for this case is
a2,1·x² + (a2,2−a1,1)·x − a1,2 = 0

With a2,1=a1,2=0 it looks simpler:
(a2,2−a1,1)·x = 0
From this we conclude that, if a2,2≠a1,1, the only solution is x=0, so our eigenvector is ||0,1||.
The eigenvalue for this eigenvector is a2,2.
If a2,2=a1,1, any x is good enough, so any vector is an eigenvector.

Determine the eigenvectors now among vectors ||1,x||.
Original quadratic equation for this case is
a1,2·x² + (a1,1−a2,2)·x − a2,1 = 0

With a2,1=a1,2=0 it looks simpler:
(a1,1−a2,2)·x = 0
From this we conclude that, if a2,2≠a1,1, the only solution is x=0, so our eigenvector is ||1,0||.
The eigenvalue for this eigenvector is a1,1.
If a2,2=a1,1, any x is good enough, so any vector is an eigenvector.

Answer A
If the matrix of transformation is diagonal
||a1,1  0 ||
|| 0  a2,2||
and a2,2≠a1,1,
the two eigenvectors are base unit vectors and the eigenvalues are a1,1 for base unit vector ||1,0|| and a2,2 for base unit vector ||0,1||.
In the case of a1,1=a2,2 any vector is an eigenvector with eigenvalue a1,1.
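The conclusion of Problem A can be checked numerically. This is a sketch (the helper name is hypothetical) reusing the characteristic quadratic, which for a diagonal matrix factors as (λ−a1,1)·(λ−a2,2).

```python
# Sketch: the eigenvalues of a diagonal 2x2 matrix are its diagonal entries.
import math

def diagonal_eigenvalues(a11, a22):
    trace, det = a11 + a22, a11 * a22
    r = math.sqrt(trace*trace - 4*det)   # = |a11 − a22|, never negative
    return sorted([(trace - r) / 2, (trace + r) / 2])

print(diagonal_eigenvalues(3, 7))   # [3.0, 7.0]
```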


Problem B
Prove that a symmetrical matrix always has real eigenvalues.

Solution B

Matrix of transformation A=||ai,j|| is symmetrical, which means a1,2=a2,1.

Recall that a necessary condition for existence of real eigenvalues λ is
(a1,1−λ)·(a2,2−λ) − a1,2· a2,1 = 0
or
λ² − (a1,1+a2,2)·λ +
+ a1,1·a2,2−a1,2·a2,1 = 0


Since we are looking for real values of λ, we have to examine the discriminant D of this quadratic equation.
D = (a1,1−a2,2)²+4·a1,2·a2,1
Since a1,2=a2,1, their product is non-negative, which makes the whole discriminant non-negative.
If D is zero, there is one real solution for λ.
If D is positive, there are two real solutions for λ.
So, one or two solutions always exist.
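The core of the argument is that, for a symmetrical matrix, the discriminant is a sum of squares. A small sketch (with my own example entries):

```python
# Sketch of Solution B: for a symmetrical matrix (a12 = a21) the
# discriminant (a11−a22)² + 4·a12² is a sum of squares, hence
# never negative, so real eigenvalues always exist.

def discriminant_symmetric(a11, a12, a22):
    return (a11 - a22)**2 + 4 * a12**2

# Non-negative for any choice of entries:
print(discriminant_symmetric(5, 6, 10))    # 169
print(discriminant_symmetric(2, -3, 2))    # 36
print(discriminant_symmetric(1, 0, 1))     # 0
```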
