Reflectors should be second nature
Reflectors are a class of matrices that are not introduced in all linear algebra textbooks. However, Carl D. Meyer's book uses this class of matrices heavily for various fundamental results. Indeed, the book uses reflectors both for theoretical purposes, such as proving the existence of fundamental factorizations like the SVD, QR, triangularization, or Hessenberg decompositions (more to come below), and for applications, such as the implementation of the QR algorithm via Householder transformations or the solution of large-scale linear systems…
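To get a feel for the central tool, here is a minimal NumPy sketch (our illustration, not code from the post) of a Householder reflector $\mathbf R = \mathbf I - 2\mathbf{uu}^T/\mathbf u^T\mathbf u$ that reflects a given vector onto a multiple of the first unit vector; the function name `householder` is ours.

```python
import numpy as np

def householder(x):
    """Reflector R = I - 2 u u^T / (u^T u) sending x to -sign(x[0]) * ||x|| * e1."""
    u = x.astype(float).copy()
    sign = 1.0 if x[0] >= 0 else -1.0       # sign choice avoids cancellation
    u[0] += sign * np.linalg.norm(x)
    return np.eye(x.size) - 2.0 * np.outer(u, u) / (u @ u)

x = np.array([3.0, 4.0, 0.0])
R = householder(x)
print(R @ x)                                # ~ [-5, 0, 0]: x lands on span{e1}
print(np.allclose(R @ R, np.eye(3)))        # reflectors are involutory: R^2 = I
```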
How to compute a basis for the range, nullspace, etc. of a matrix? 6 Approaches
The four fundamental subspaces of a matrix $A$, namely the range and the nullspace of $A$ itself and of its transpose $A^T$, are at the heart of linear algebra. We often find ourselves in need of computing a basis for the range or the nullspace of a matrix, for theoretical or practical purposes. There are many ways of computing such a basis for $A$ or $A^T$. Some are better suited to applications, either due to their robustness against floating point…
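As a taste of one such approach (the SVD route, just one of the six; the function name `fundamental_bases` and the tolerance are our own choices), a minimal NumPy sketch:

```python
import numpy as np

def fundamental_bases(A, tol=1e-12):
    """Orthonormal bases for the four fundamental subspaces of A via the SVD."""
    U, s, Vt = np.linalg.svd(A)                          # full SVD
    r = int(np.sum(s > tol * s.max())) if s.size else 0  # numerical rank
    return {
        "R(A)":   U[:, :r],    # range (column space) of A
        "N(A^T)": U[:, r:],    # left nullspace
        "R(A^T)": Vt[:r].T,    # row space of A
        "N(A)":   Vt[r:].T,    # nullspace of A
    }

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])                 # rank 1
print({k: v.shape for k, v in fundamental_bases(A).items()})
```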
How to compute the “real” rank of a matrix?
If you fill an $n\times n$ matrix with random entries, then you'll almost surely end up with a full-rank matrix. Likewise, any matrix constructed from real, continuous data (e.g., sensor input) will almost surely be of full rank, even if the underlying process should have led to linearly dependent columns/rows. Further, if we do not use exact arithmetic but, say, floating point arithmetic, our $\mathbf A$ will almost surely be somewhat perturbed, especially if it is a result…
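A minimal sketch of the usual remedy: count singular values above a scale-aware threshold. The tolerance formula below mirrors the default used by `numpy.linalg.matrix_rank`; the perturbed rank-1 example is our own.

```python
import numpy as np

def numerical_rank(A):
    """Numerical rank: singular values above a threshold scaled by the largest one."""
    s = np.linalg.svd(A, compute_uv=False)
    tol = max(A.shape) * np.finfo(A.dtype).eps * s.max()
    return int(np.sum(s > tol))

# A rank-1 matrix perturbed by tiny floating-point-scale noise:
rng = np.random.default_rng(0)
u, v = rng.standard_normal(50), rng.standard_normal(50)
A = np.outer(u, v) + 1e-14 * rng.standard_normal((50, 50))
print(numerical_rank(A), np.linalg.matrix_rank(A))   # expect both to report 1
```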
How to prove the SVD? A recipe approach
Proving the existence of the SVD is no trivial task, but it turns out not to be too difficult either. One needs a few ingredients (hence the title), but once we know them and understand the overall idea, the proof falls into place. Below we list the basic ingredients needed to prove the existence of the SVD: the URV decomposition $\mathbf{A} = \mathbf{URV} = \mathbf{U}\begin{pmatrix}\mathbf C & \mathbf 0 \\ \mathbf 0 & \mathbf 0\end{pmatrix}\mathbf{V}$ and the 2-norm invariance $\|\mathbf{A}\|_2 = \|\mathbf{URV}\|_2 =…$
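To see why the norm identity matters (our expansion of the standard step, not the post's full argument): the orthogonal factors $\mathbf U$ and $\mathbf V$ preserve the 2-norm, so

$$\|\mathbf{A}\|_2 = \|\mathbf{URV}\|_2 = \|\mathbf{R}\|_2 = \left\|\begin{pmatrix}\mathbf C & \mathbf 0 \\ \mathbf 0 & \mathbf 0\end{pmatrix}\right\|_2 = \|\mathbf C\|_2,$$

which reduces all the 2-norm information of $\mathbf A$ to the nonsingular block $\mathbf C$.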
Since we already have the SVD, do we need the URV factorization?
The SVD is a special case of the URV factorization and has many great properties. The latter decomposes a matrix $A_{m\times n}$ as $$A=URV^T,$$ where $U=(U_1|U_2)$ is an orthogonal matrix such that the columns of $U_1$ form a basis for $R(A)$ and those of $U_2$ a basis for $N(A^T)$, and $V=(V_1|V_2)$ is another orthogonal matrix such that the columns of $V_1$ form a basis for $R(A^T)$ and those of $V_2$ a basis for $N(A)$. Suppose that the rank of $A$ is $r$, in which case $U_1$ has $r$ columns…
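As a rough numerical companion (one common construction via two rank-revealing QR factorizations, assuming NumPy and SciPy; the post may proceed differently), the sketch below builds $U$ and $V$ from pivoted QRs of $A$ and $A^T$ and exhibits the block structure of $R$:

```python
import numpy as np
from scipy.linalg import qr

def urv(A, tol=1e-12):
    """URV factorization A = U @ R @ V.T from two pivoted (rank-revealing) QRs.
    U = (U1|U2): U1 spans R(A), U2 spans N(A^T);
    V = (V1|V2): V1 spans R(A^T), V2 spans N(A)."""
    U, T, _ = qr(A, pivoting=True)            # U[:, :r] spans R(A)
    r = int(np.sum(np.abs(np.diag(T)) > tol)) # crude rank estimate from T
    V, _, _ = qr(A.T, pivoting=True)          # V[:, :r] spans R(A^T)
    R = U.T @ A @ V                           # block form [[C, 0], [0, 0]]
    return U, R, V, r

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])                  # rank 2
U, R, V, r = urv(A)
print(np.round(R, 10))                        # only the leading r-by-r block C is nonzero
print(np.allclose(A, U @ R @ V.T))            # True
```

Note that this construction needs only QR machinery, whereas the SVD additionally diagonalizes the block $\mathbf C$.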