The orthogonal transformations $P_{pq}$ annihilate the element $(p, q)$ of the matrix they act on. Successive transformations spoil previously created zeros, but the off-diagonal terms nevertheless keep shrinking, until they are nearly zero and all that is left is the diagonal, which carries the eigenvalues. Taking the product of all the transformations $P_{pq}$ yields the matrix of eigenvectors.

The method is absolutely foolproof for real symmetric matrices, but it is slow: painfully slow for all but small matrices. It is a simple and stable algorithm, though, and it parallelises well too.
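To make the procedure concrete before deriving the individual update formulas, here is a minimal sketch of the whole iteration, assuming NumPy; it forms the full products $P_{pq}^{T} A P_{pq}$ for clarity rather than the cheaper element-wise updates derived below, and the function name `jacobi_eigen` is made up for this illustration.

```python
import numpy as np

def jacobi_eigen(A, tol=1e-12, max_sweeps=50):
    """Classical cyclic Jacobi: repeatedly annihilate a[p, q] with plane rotations."""
    a = np.array(A, dtype=float)       # work on a copy
    n = a.shape[0]
    V = np.eye(n)                      # accumulated product of all rotations P_pq
    for _ in range(max_sweeps):
        # size of the off-diagonal part; stop once it is (nearly) zero
        if np.sqrt(np.sum(np.tril(a, -1) ** 2)) < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if a[p, q] == 0.0:
                    continue
                theta = (a[q, q] - a[p, p]) / (2.0 * a[p, q])
                t = np.sign(theta) / (abs(theta) + np.sqrt(theta**2 + 1.0)) if theta != 0.0 else 1.0
                c = 1.0 / np.sqrt(t**2 + 1.0)
                s = t * c
                P = np.eye(n)
                P[p, p] = P[q, q] = c
                P[p, q], P[q, p] = s, -s
                a = P.T @ a @ P        # similarity transform; zeroes a[p, q]
                V = V @ P              # eigenvectors: product of the rotations
    return np.diag(a).copy(), V
```

For a real symmetric `A`, the returned pair `(w, V)` satisfies `np.allclose(A @ V, V * w)` up to the chosen tolerance.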
The Jacobi rotation matrix $P_{pq}$ is an identity matrix, except for the four entries at the intersections of rows and columns $p$ and $q$:

$$
P_{pq} =
\begin{pmatrix}
1 &        &    &        &    &        &   \\
  & \ddots &    &        &    &        &   \\
  &        &  c & \cdots &  s &        &   \\
  &        & \vdots & 1 & \vdots &     &   \\
  &        & -s & \cdots &  c &        &   \\
  &        &    &        &    & \ddots &   \\
  &        &    &        &    &        & 1
\end{pmatrix},
\qquad
c^2 + s^2 = 1 . \tag{3.8}
$$
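For concreteness, here is a small NumPy sketch (the helper name `rotation` and the specific numbers are mine, not from the notes) showing that the constraint (3.8) is exactly what makes $P_{pq}$ orthogonal:

```python
import numpy as np

def rotation(n, p, q, c, s):
    """Plane rotation P_pq: identity except in rows/columns p and q."""
    P = np.eye(n)
    P[p, p] = P[q, q] = c
    P[p, q], P[q, p] = s, -s
    return P

# any c, s with c^2 + s^2 = 1 gives an orthogonal matrix
c, s = np.cos(0.3), np.sin(0.3)
P = rotation(5, 1, 3, c, s)
assert np.allclose(P.T @ P, np.eye(5))
```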
It is quite easy to see what the effect of the transformation of equation (3.9) is going to be on selected terms. First we need an expression that describes a generic term of the matrix $P_{pq}$ in terms of its indices. Then we can evaluate $a'_{rp}$, assuming summation over the dummy indices $i$ and $j$ and using the orthogonality of $P_{pq}$ (so that $P^{-1} = P^{T}$):

$$ a'_{rp} = P^{-1}_{ri}\, a_{ij}\, P_{jp} = P_{ir}\, a_{ij}\, P_{jp} . $$

Carrying out the contraction for $r \neq p$ and $r \neq q$ gives

$$ a'_{rp} = c\, a_{rp} - s\, a_{rq} \tag{3.12} $$

$$ a'_{rq} = c\, a_{rq} + s\, a_{rp} \tag{3.13} $$

while for the terms that involve only $p$ and $q$:

$$ a'_{pp} = c^2\, a_{pp} + s^2\, a_{qq} - 2\, s\, c\, a_{pq} \tag{3.14} $$

$$ a'_{qq} = s^2\, a_{pp} + c^2\, a_{qq} + 2\, s\, c\, a_{pq} \tag{3.15} $$

$$ a'_{pq} = (c^2 - s^2)\, a_{pq} + s\, c\, (a_{pp} - a_{qq}) \tag{3.16} $$
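As a sanity check (a NumPy sketch with arbitrarily chosen values, not part of the derivation), the element-wise expressions (3.12) through (3.16) can be compared against the explicit product $P_{pq}^{T} A\, P_{pq}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 5, 1, 3, 0
A = rng.normal(size=(n, n))
A = (A + A.T) / 2.0                      # random symmetric test matrix

c, s = np.cos(0.3), np.sin(0.3)          # any pair with c^2 + s^2 = 1
P = np.eye(n)
P[p, p] = P[q, q] = c
P[p, q], P[q, p] = s, -s

Ap = P.T @ A @ P                         # full similarity transform

assert np.isclose(Ap[r, p], c*A[r, p] - s*A[r, q])                            # (3.12)
assert np.isclose(Ap[p, p], c**2*A[p, p] + s**2*A[q, q] - 2*s*c*A[p, q])      # (3.14)
assert np.isclose(Ap[q, q], s**2*A[p, p] + c**2*A[q, q] + 2*s*c*A[p, q])      # (3.15)
assert np.isclose(Ap[p, q], (c**2 - s**2)*A[p, q] + s*c*(A[p, p] - A[q, q]))  # (3.16)
```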
The purpose of the Jacobi rotation $P_{pq}$ is to kill $a'_{pq}$. Thus we require:

$$ (c^2 - s^2)\, a_{pq} + s\, c\, (a_{pp} - a_{qq}) = 0 . $$
Now, observe that this condition can be recast as

$$ \theta \equiv \frac{a_{qq} - a_{pp}}{2\, a_{pq}} = \frac{c^2 - s^2}{2\, s\, c} . $$

Next recall the following simple algebraic identity: writing $t = s/c$,

$$ \frac{c^2 - s^2}{2\, s\, c} = \frac{1 - t^2}{2\, t} , $$

so the condition becomes a quadratic equation in $t$:

$$ t^2 + 2\, \theta\, t - 1 = 0 . \tag{3.22} $$

Solving equation (3.22) with respect to $t$ yields the following:

$$ t = -\theta \pm \sqrt{\theta^2 + 1} . $$

Now, this solution can be rewritten in a computationally more convenient form. Consider the $+$ case:

$$ t = \sqrt{\theta^2 + 1} - \theta
     = \frac{\left(\sqrt{\theta^2 + 1} - \theta\right)\left(\sqrt{\theta^2 + 1} + \theta\right)}{\sqrt{\theta^2 + 1} + \theta}
     = \frac{1}{\theta + \sqrt{\theta^2 + 1}} , $$

and the $-$ case works out analogously, so the root of smaller magnitude can be written as

$$ t = \frac{\operatorname{sgn}(\theta)}{|\theta| + \sqrt{\theta^2 + 1}} , $$

which avoids the cancellation that the original form suffers from when $\theta$ is large. Once we have the $t$, we get $c$ and $s$ as follows:

$$ c = \frac{1}{\sqrt{t^2 + 1}}, \qquad s = t\, c . $$
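A short numerical sketch of this recipe (plain Python; the helper name `rotation_coefficients` and the sample values are mine) confirms that the resulting $c$ and $s$ really annihilate the transformed off-diagonal term:

```python
import math

def rotation_coefficients(app, aqq, apq):
    """Return (c, s) that annihilate a_pq, using the smaller root for t."""
    theta = (aqq - app) / (2.0 * apq)
    t = math.copysign(1.0, theta) / (abs(theta) + math.sqrt(theta**2 + 1.0))
    c = 1.0 / math.sqrt(t**2 + 1.0)
    return c, t * c

app, aqq, apq = 4.0, 2.0, 1.0
c, s = rotation_coefficients(app, aqq, apq)
# the rotated off-diagonal term (3.16) vanishes, as required
assert abs((c**2 - s**2) * apq + s * c * (app - aqq)) < 1e-12
```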
Equations (3.12) through (3.16) are now rewritten so as to minimize the round-off error and to cast them in the form "new quantity equals the old one plus a small correction".

$$ a'_{pp} = c^2\, a_{pp} + s^2\, a_{qq} - 2\, s\, c\, a_{pq} = a_{pp} - t\, a_{pq} \tag{3.28} $$

Similarly, from the same equation (3.26), we get:

$$ a'_{qq} = s^2\, a_{pp} + c^2\, a_{qq} + 2\, s\, c\, a_{pq} = a_{qq} + t\, a_{pq} \tag{3.30} $$
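A quick numerical check of (3.28) and (3.30), with the same arbitrarily chosen sample values as above (plain Python, nothing beyond the standard `math` module assumed), shows that the long and short forms agree:

```python
import math

app, aqq, apq = 4.0, 2.0, 1.0
theta = (aqq - app) / (2.0 * apq)
t = math.copysign(1.0, theta) / (abs(theta) + math.sqrt(theta**2 + 1.0))
c = 1.0 / math.sqrt(t**2 + 1.0)
s = t * c

# long forms (3.14)/(3.15) versus the short "old value plus correction" forms
assert math.isclose(c*c*app + s*s*aqq - 2*s*c*apq, app - t*apq)   # (3.28)
assert math.isclose(s*s*app + c*c*aqq + 2*s*c*apq, aqq + t*apq)   # (3.30)
```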
For $a'_{rp}$ and $a'_{rq}$ (with $r \neq p, q$) the computation is simpler still:

$$ a'_{rp} = c\, a_{rp} - s\, a_{rq} = a_{rp} + (c - 1)\, a_{rp} - s\, a_{rq} $$

$$ a'_{rq} = c\, a_{rq} + s\, a_{rp} = a_{rq} + (c - 1)\, a_{rq} + s\, a_{rp} $$
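Putting the rewritten formulas together, a single in-place Jacobi rotation might look as follows. This is only a sketch under the conventions above (NumPy, the full symmetric matrix stored explicitly); the name `jacobi_rotate` and the in-place interface are mine, not from the notes.

```python
import numpy as np

def jacobi_rotate(a, V, p, q):
    """One Jacobi rotation applied in place to the symmetric matrix a,
    using the 'old value plus small correction' update formulas.
    The product of the rotations (the eigenvectors) is accumulated in V."""
    if a[p, q] == 0.0:
        return
    theta = (a[q, q] - a[p, p]) / (2.0 * a[p, q])
    t = np.sign(theta) / (abs(theta) + np.sqrt(theta**2 + 1.0)) if theta != 0.0 else 1.0
    c = 1.0 / np.sqrt(t**2 + 1.0)
    s = t * c

    apq = a[p, q]
    a[p, p] -= t * apq                   # equation (3.28)
    a[q, q] += t * apq                   # equation (3.30)
    a[p, q] = a[q, p] = 0.0              # the annihilated element

    for r in range(a.shape[0]):          # remaining rows and columns
        if r in (p, q):
            continue
        arp, arq = a[r, p], a[r, q]
        a[r, p] = a[p, r] = arp + (c - 1.0) * arp - s * arq
        a[r, q] = a[q, r] = arq + (c - 1.0) * arq + s * arp

    for r in range(V.shape[0]):          # V <- V * P_pq, column by column
        vrp, vrq = V[r, p], V[r, q]
        V[r, p] = c * vrp - s * vrq
        V[r, q] = c * vrq + s * vrp
```

A complete solver would sweep over all pairs $p < q$ and repeat the sweeps until the off-diagonal norm falls below a tolerance, as in the rough sketch at the beginning of the section.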