Linear Algebra

Since we already have SVD, do we need URV factorization?

The SVD is a special case, with many great properties, of the more general URV factorization. The latter decomposes a matrix $A_{m\times n}$ as $$A=URV^T,$$ where $U=(U_1|U_2)$ is an orthogonal matrix such that the columns of $U_1$ form an orthonormal basis for $R(A)$ and those of $U_2$ one for $N(A^T)$; and $V=(V_1|V_2)$ is another orthogonal matrix such that the columns of $V_1$ form an orthonormal basis for $R(A^T)$ and those of $V_2$ one for $N(A)$. If the rank of $A$ is $r$, then $U_1$ and $V_1$ each have $r$ columns.

According to the fundamental theorem of linear algebra, $R(A)\perp N(A^T)$ and $R(A^T)\perp N(A)$, so $U$ and $V$ are indeed square orthogonal matrices. Moreover, $AV_2=0$ and $U_2^TA=0$ by the definitions of $N(A)$ and $N(A^T)$, so $R=U^TAV$ has the form $$R=\begin{pmatrix} C & 0 \\ 0 & 0 \end{pmatrix},$$ where $C$ is an $r\times r$ nonsingular matrix.
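To make this concrete, here is a minimal numpy sketch (the example matrix and the use of the SVD to obtain one valid set of subspace bases are my own illustrative choices): it extracts orthonormal bases for the four fundamental subspaces of a rank-deficient $A$ and verifies that $R=U^TAV$ has the promised block structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 5x4 matrix of rank 2, built as a product of thin factors.
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))

# The full SVD gives one valid choice of bases for the four subspaces.
W, s, Zt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))      # numerical rank

U1, U2 = W[:, :r], W[:, r:]     # orthonormal bases for R(A) and N(A^T)
V1, V2 = Zt[:r].T, Zt[r:].T     # orthonormal bases for R(A^T) and N(A)

U = np.hstack([U1, U2])
V = np.hstack([V1, V2])

# R = U^T A V has the block form [[C, 0], [0, 0]] with C nonsingular.
R = U.T @ A @ V
C = R[:r, :r]

assert np.allclose(R[:r, r:], 0) and np.allclose(R[r:, :], 0)
assert abs(np.linalg.det(C)) > 1e-10   # C is nonsingular
assert np.allclose(U @ R @ V.T, A)     # A = U R V^T
```

With this particular choice of bases, $C$ comes out diagonal, i.e., we recover the SVD itself; the point of the next paragraphs is that many other choices work too.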

Let’s return to the question posed in the title: since we already have the SVD, which is a special case of URV with great properties thanks to $C$ being diagonal, should we still care about the URV factorization? Or is URV just an evolutionary intermediate species that we leave behind?

The answer is, as you probably guessed, that there is still some advantage to keeping the URV factorization in mind. I do not know whether it leads to any computational advantages or has direct application value, but it at least has theoretical value, as I discovered in Section 5.15 of Carl D. Meyer’s *Matrix Analysis and Applied Linear Algebra*.

The main difference between URV and SVD that we highlight in this article is that, while the SVD is essentially unique (the singular values are determined by $A$, and the singular vectors are too, up to signs and rotations within repeated singular values), URV is not. Uniqueness is sometimes an advantage and sometimes it isn’t: being unique means that we have no choice over the bases $U$ and $V$. The exact opposite holds for the URV factorization: any orthogonal $U$ and $V$ whose blocks satisfy the conditions listed in the first paragraph of this article will do. This is an advantage whenever we are handed specific bases $U$ and $V$, because it means we can still build a URV decomposition around them, as the sketch below illustrates.
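Here is a small numpy sketch of that freedom (again starting from SVD-derived bases, an illustrative choice): rotating the columns of $U_1$ and $V_1$ by arbitrary orthogonal matrices produces new, equally valid bases, and the resulting $C$ is still nonsingular but no longer diagonal.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))  # rank 2
r = 2

W, s, Zt = np.linalg.svd(A)
U1, U2, V1, V2 = W[:, :r], W[:, r:], Zt[:r].T, Zt[r:].T

# Rotating within R(A) and R(A^T) gives new, equally valid orthonormal bases.
Q1, _ = np.linalg.qr(rng.standard_normal((r, r)))   # random r x r orthogonal
Q2, _ = np.linalg.qr(rng.standard_normal((r, r)))

U = np.hstack([U1 @ Q1, U2])
V = np.hstack([V1 @ Q2, V2])

R = U.T @ A @ V
C = R[:r, :r]   # equals Q1^T diag(s_1, s_2) Q2: nonsingular, not diagonal

assert np.allclose(U @ R @ V.T, A)
assert np.allclose(R[:r, r:], 0) and np.allclose(R[r:, :], 0)
print(np.round(C, 3))   # a full 2x2 block rather than a diagonal one
```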

The latter fact is used to prove that the minimal angle $\theta_{\text{min}}$ between complementary subspaces $\mathcal M$ and $\mathcal N$ satisfies $$\sin\theta_{\text{min}}=\frac{1}{||P_{\mathcal{MN}}||_2},$$ where $P_{\mathcal{MN}}$ is the oblique projector onto $\mathcal M$ along $\mathcal N$. We won’t go through the proof, but we will highlight the role of URV in it. The proof starts by constructing the orthogonal projectors onto $\mathcal M$ and $\mathcal N$, which are $P_{\mathcal M}=U_1 U_1^T$ and $P_{\mathcal N}=V_2 V_2^T$ for some orthogonal matrices $U=(U_1|U_2)$ and $V=(V_1|V_2)$. The key step is that we are able to claim that $$P_{\mathcal{MN}}=U\begin{pmatrix} C & 0 \\ 0 & 0\end{pmatrix}V^T$$ for some nonsingular matrix $C$, and this is thanks to the URV factorization: it allows us to write down this decomposition even though we do not know $U$, $V$, or $C$ explicitly; it suffices that $U$ and $V$ satisfy the conditions in the first paragraph.
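As a sanity check of the identity (not part of Meyer’s proof; the random subspaces and variable names below are my own), the following numpy sketch builds complementary subspaces $\mathcal M$ and $\mathcal N$ of $\mathbb R^5$, forms the oblique projector onto $\mathcal M$ along $\mathcal N$, and confirms that $\sin\theta_{\text{min}}$ matches $1/||P_{\mathcal{MN}}||_2$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 5, 2

# Random complementary subspaces: M = R(M_b) (dim 2), N = R(N_b) (dim 3).
M_b = rng.standard_normal((n, p))
N_b = rng.standard_normal((n, n - p))
B = np.hstack([M_b, N_b])
assert np.linalg.matrix_rank(B) == n    # M and N are indeed complementary

# Oblique projector onto M along N: write x = m + n, then P x = m.
E = np.zeros((n, n))
E[:p, :p] = np.eye(p)
P = B @ E @ np.linalg.inv(B)

# Minimal angle via principal angles: cos(theta_min) is the largest
# singular value of Q_M^T Q_N for orthonormal bases Q_M, Q_N.
QM, _ = np.linalg.qr(M_b)
QN, _ = np.linalg.qr(N_b)
cos_min = np.linalg.norm(QM.T @ QN, 2)
sin_min = np.sqrt(1.0 - cos_min**2)

assert np.isclose(sin_min, 1.0 / np.linalg.norm(P, 2))
```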