# SVD Visualization Tool for 2-by-2 Matrices

December 20, 2018

Roughly a year ago, I took a class on matrix theory. Back then, I struggled to understand and visualize the effect of the SVD on a matrix. Inspired by Nicky Case's “Magnificent 2D Matrix” explorable, I decided to make a small JavaScript tool to visualize what an SVD does in terms of 2D vector transformation. Now that I have this website, I decided to clean it up a bit and share it here.

You can find the visualizer directly below. In case you don't know what an SVD is, I wrote a (very brief) explanation of the fundamentals at the end of this blog post. Note that a basic understanding of linear algebra is required.

## Two-dimensional SVD Visualizer

[Interactive visualizer: each dot on a chart is a vector transformed by $$M (x, y)^T = U \Sigma V^T (x, y)^T$$, where $$M$$ is an arbitrary matrix, $$U$$ and $$V^T$$ are isometries, and $$\Sigma$$ is a scaling.]

Each dot on the chart above represents a single vector $$(x, y)^T$$ after it has been transformed by the matrix $$M$$. If you haven't changed anything, $$M$$ is set to the identity, so each vector is displayed in its initial position, without transformation.

Below the chart is a mathematical expression. On the left-hand side is the matrix $$M$$, shown multiplying an arbitrary vector $$(x, y)^T$$ that represents the vectors on the chart in their initial position. You can adjust the coefficients of $$M$$ here by clicking and dragging over the input; this, in turn, will change the location of the vectors on the chart. You can also use the provided links to set $$M$$ to a pre-defined matrix.

On the right-hand side of the expression is the SVD of $$M$$, again shown multiplying $$(x, y)^T$$. The SVD splits $$M$$ into an isometry $$V^T$$ (i.e., a rotation and/or symmetry), followed by a scaling $$\Sigma$$, followed by another isometry $$U$$.

Hovering your mouse over various elements in this expression allows you to witness the effect of each part of the SVD:

• Hovering over either of the two $$(x, y)^T$$ temporarily brings the vectors back to their initial position.
• Hovering over $$V^T$$ shows the effect of applying only the first isometry $$V$$ to the vectors.
• Hovering over $$\Sigma$$ shows the effect of applying first $$V^T$$, then $$\Sigma$$ (or equivalently, applying $$\Sigma V^T$$) to the vectors.
• Hovering anywhere else just shows the effect of $$M$$ (which is equivalent to applying $$V^T$$, then $$\Sigma$$, then $$U$$); a numerical sketch of these stages follows below.
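If you'd rather read code than hover, here is a minimal NumPy sketch of the same three stages. The example matrix and vector are arbitrary choices of mine, not values from the tool (which is written in JavaScript):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 1.0]])          # an arbitrary 2-by-2 matrix
U, s, Vt = np.linalg.svd(M)         # decompose M = U @ diag(s) @ Vt
Sigma = np.diag(s)                  # rebuild Sigma from the singular values

v = np.array([1.0, 1.0])            # one of the vectors on the chart

stage_v = Vt @ v                    # hovering over V^T: first isometry only
stage_s = Sigma @ stage_v           # hovering over Sigma: isometry, then scaling
stage_u = U @ stage_s               # no hover: the full transform

assert np.allclose(stage_u, M @ v)  # the three stages compose back to M
```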

## What's an SVD Anyway?

One way to see a matrix is as a (linear) function that maps a vector onto another vector—a so-called linear transform—where the function is applied by means of matrix-vector multiplication. While simple, matrices can represent relatively complex operations. In particular, as the size of a matrix grows, it becomes increasingly difficult to understand what that matrix does.
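Concretely, in the 2-by-2 case, applying the function is just:

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} ax + by \\ cx + dy \end{pmatrix}$$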

Enter the SVD, short for singular value decomposition. The SVD breaks down any matrix $$M$$ into three simpler matrices $$U$$, $$\Sigma$$ and $$V$$, such that $$M = U \Sigma V^T$$.
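If you want to compute an SVD yourself, here is a minimal sketch using NumPy (my choice of library, not something the tool relies on):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [0.0, 3.0]])       # any 2-by-2 matrix works here

U, s, Vt = np.linalg.svd(M)      # s is a 1-D array of singular values
Sigma = np.diag(s)               # place them on the diagonal of Sigma

# The factors multiply back to the original matrix: M = U Sigma V^T
assert np.allclose(U @ Sigma @ Vt, M)
```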

In terms of linear transformations, $$U$$ and $$V$$ are called isometries. These are operations that preserve distances. In two- or three-dimensional space, an isometry is just a rotation and/or a symmetry.
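The standard example of a 2D isometry is the rotation matrix, which turns a vector without ever changing its length:

$$R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \lVert R(\theta)\, v \rVert = \lVert v \rVert \text{ for every vector } v.$$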

The matrix in the middle, $$\Sigma$$ (“sigma”), represents a scaling along the basis axes (e.g., the x-y axes of a 2D plot). It is a diagonal matrix (meaning all entries outside the diagonal are zero) with non-negative entries. Starting from the top-left corner, the first entry is the scaling factor along the first axis (typically, $$x$$), the second entry is the scaling factor along the second axis (typically, $$y$$), and so on. These diagonal entries are called the singular values of $$M$$.
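For example, in 2D,

$$\Sigma = \begin{pmatrix} 3 & 0 \\ 0 & \tfrac{1}{2} \end{pmatrix}$$

stretches every vector by a factor of 3 along $$x$$ and compresses it by half along $$y$$; its singular values are $$3$$ and $$\tfrac{1}{2}$$.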

Hence, any matrix can be represented as an isometry, followed by a scaling along the basis axes, followed by another isometry. This result allows us to derive some important facts about a matrix: namely, how much it scales vectors, and along which axes. This turns out to be a pretty big deal in data science, where the SVD is the basis for a tool called principal component analysis.