Data Science: Computation of Eigenvectors — QR-Algorithm Implementation.

applied.math.coding
3 min read · Apr 26

This story is part of my Data Science series.

In our previous story (see here) we looked at the theory behind the QR-algorithm, which aims to find all eigenvector/eigenvalue pairs of a matrix. Now we consider a possible implementation in Rust.

There are two steps to really ‘deep learning’ an algorithm. First, get a rough understanding of its theory. Second, try to implement it. For those who are mainly interested in applications, the second part might already be sufficient. As such, the implementation we will see here is meant for purely educational purposes. Far more optimized versions exist in standard libraries.

QR decomposition:

Remember, the QR-algorithm relies heavily on the QR decomposition, so let us start by implementing that.
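As a brief reminder of the iteration from the previous story (stated here in standard notation): starting from the matrix itself, one repeatedly factors and re-multiplies in reverse order,

```latex
A_0 = A, \qquad A_k = Q_k R_k, \qquad A_{k+1} = R_k Q_k .
```

Since A_{k+1} = Q_kᵀ A_k Q_k, every iterate is similar to A, and under suitable conditions the iterates converge to an upper triangular matrix whose diagonal entries are the eigenvalues of A.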

The only dependency is ndarray, a widely used Rust library for n-dimensional arrays:

[dependencies]
ndarray = "0.15.0"

Given a regular matrix A, the QR decomposition aims at a factorization of the form A = QR, where Q is an orthogonal matrix (its columns are orthonormal) and R is upper triangular. This can be achieved by applying the Gram-Schmidt procedure to the columns of A. We already did something like this in this story, but for convenience let me paste it here:

use ndarray::Array2;

pub fn compute_orthonormal_base(a: &Array2<f64>) -> Array2<f64> {
    let mut q = Array2::zeros((a.nrows(), 0));
    for i in 0..a.ncols() {
        let v_i = a.column(i);
        let mut u_i = v_i.to_owned();
        // Subtract the projections of v_i onto the already computed basis vectors.
        for j in 0..i {
            let u_j = q.column(j);
            u_i = u_i - v_i.t().dot(&u_j) * &u_j;
        }
        // Normalize and append as the next column of Q.
        u_i = &u_i / (&u_i.t().dot(&u_i)).sqrt();
        q.push_column(u_i.view()).unwrap();
    }
    q
}

The returned matrix is already the Q. Its columns form an orthonormal basis, and it has the property that each column of A with index i lies in the span of the first i + 1 columns of Q; equivalently, it is orthogonal to all columns of Q with index greater than i. This is exactly what makes the triangular factor upper triangular.
