Any real matrix, symmetric or not, can be decomposed into either
of the forms

    A = QR   or   A = QL,

where Q is an orthogonal matrix and R is upper (right) triangular, or L is lower (left) triangular, respectively.
It is easy to see how such a decomposition can be effected by
applying, for example, Householder reflections or Givens rotations
P_i to A, in order to annihilate everything below
the diagonal. Observe that these will not be full similarity
transformations; in other words, they are not going to be
of the form P_i A P_i^T. For that we would also have to act with
P_i^T from the right, in order to do to the columns what application of
P_i from the left did to the rows.
So if we just leave the right-hand factors out, after the k-th step we will
be left with a new matrix P_k ... P_2 P_1 A with, say, the k-th
column zeroed under the diagonal.
An accumulation of those
P_i is some combined rotation
Q^T,

    Q^T = P_{n-1} ... P_2 P_1,

such that

    Q^T A = R.
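The annihilation scheme just described can be sketched concretely. The following is a minimal, unoptimized Python/NumPy illustration (not a production routine) that builds Q^T as an explicit product of Givens rotations, one per subdiagonal element:

```python
import numpy as np

def qr_givens(A):
    """QR factorization by Givens rotations: returns Q, R with A = Q @ R.

    Each rotation P acts from the left to annihilate one element below
    the diagonal; the accumulated product of the P's is Q^T, so Q^T A = R.
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    QT = np.eye(n)                  # accumulates Q^T = P_{n-1} ... P_1
    R = A.copy()
    for j in range(n - 1):          # work column by column
        for i in range(j + 1, n):   # rows below the diagonal
            r = np.hypot(R[j, j], R[i, j])
            if r == 0.0:
                continue
            c, s = R[j, j] / r, R[i, j] / r
            P = np.eye(n)           # rotation in the (j, i) plane
            P[j, j] = P[i, i] = c
            P[j, i], P[i, j] = s, -s
            R = P @ R               # zeroes R[i, j]
            QT = P @ QT
    return QT.T, R
```

Forming each P as a full n-by-n matrix is wasteful; real codes apply the rotation to the two affected rows only, but the explicit product makes the accumulation of Q^T transparent.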
Now, once we have done all that hard work and found both
Q and R, by merely changing the
order of the factors, i.e., replacing QR by RQ,
we effect a
similarity transformation on A:

    A' = RQ = (Q^T A) Q = Q^T A Q.

This is called the QR transformation of matrix A, and it has the following properties: it preserves the eigenvalues of A, since it is a similarity transformation; it preserves symmetry, since (Q^T A Q)^T = Q^T A^T Q; and it preserves tridiagonal form, so a symmetric tridiagonal matrix remains symmetric tridiagonal.
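These properties are easy to check numerically. A short Python/NumPy sketch, using an arbitrary symmetric tridiagonal matrix as an example:

```python
import numpy as np

# One QR transformation: factor A = QR, then recombine in reverse order.
# Since R = Q^T A, the product RQ equals Q^T A Q -- a similarity
# transformation, so the spectrum of A is unchanged.
A = np.array([[4., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])          # symmetric tridiagonal example
Q, R = np.linalg.qr(A)
A_next = R @ Q

assert np.allclose(A_next, Q.T @ A @ Q)                    # similarity
assert np.allclose(np.linalg.eigvalsh(A_next),
                   np.linalg.eigvalsh(A))                  # same eigenvalues
assert np.allclose(A_next, A_next.T)                       # symmetry kept
assert np.allclose(np.tril(A_next, -2), 0.0)               # still tridiagonal
```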
The QR algorithm now works as follows. Once you have a tridiagonal matrix A, you apply QR transformations to it repeatedly: factor A_s = Q_s R_s and form A_{s+1} = R_s Q_s. As the iteration proceeds, the off-diagonal elements shrink toward zero, and A converges to a diagonal matrix whose entries are the eigenvalues. Because each QR transformation of a tridiagonal matrix costs only O(n) operations, the combination of Householder tridiagonalization followed by QR transformations is such an efficient method for finding the eigenvalues of a symmetric matrix.
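The iteration can be condensed into a few lines. The following Python/NumPy sketch is the plain, unshifted version; practical implementations add shifts and deflation to accelerate convergence, and the tolerance and iteration cap below are arbitrary illustrative choices:

```python
import numpy as np

def qr_eigenvalues(A, tol=1e-12, max_iter=500):
    """Unshifted QR iteration on a symmetric matrix (a minimal sketch).

    Repeats A <- R Q until the off-diagonal elements are negligible;
    the diagonal then holds the eigenvalues.
    """
    A = np.array(A, dtype=float)
    for _ in range(max_iter):
        Q, R = np.linalg.qr(A)
        A = R @ Q                                  # similarity transform
        off_diag = A - np.diag(np.diag(A))
        if np.max(np.abs(off_diag)) < tol:         # converged to diagonal
            break
    return np.sort(np.diag(A))
```

For the symmetric tridiagonal matrix with 2 on the diagonal and 1 on the off-diagonals (n = 3), this returns the exact eigenvalues 2 - sqrt(2), 2, 2 + sqrt(2) to machine precision.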
If there is a degenerate eigenvalue, i.e., a value of multiplicity p, then at the end of this procedure there will be p-1 non-vanishing elements on the super- and sub-diagonals that correspond to that value. Those elements then determine a submatrix that can be cut out of the original matrix A and diagonalized separately, for example by evaluating its characteristic polynomial, or by Jacobi iterations.
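The cutting-out step amounts to splitting the matrix wherever a sub-diagonal element has become negligible. A toy Python/NumPy version of that splitting, with an arbitrary tolerance, might look like this:

```python
import numpy as np

def split_blocks(T, tol=1e-12):
    """Split a symmetric tridiagonal matrix at negligible sub-diagonal
    elements (a toy version of the deflation step described above).

    Returns the list of diagonal blocks, each of which can then be
    diagonalized separately.
    """
    T = np.array(T, dtype=float)
    n = T.shape[0]
    blocks, start = [], 0
    for k in range(n - 1):
        if abs(T[k + 1, k]) < tol:            # matrix decouples here
            blocks.append(T[start:k + 1, start:k + 1])
            start = k + 1
    blocks.append(T[start:, start:])          # trailing block
    return blocks
```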
There is nothing sacred about QR versus QL, so everything that's been said about QR works just as well for QL. But when you write a program, of course, you must choose one or the other. Since academics have a long history of leaning towards the left, we're going to choose QL in our example code too.