Singular value decomposition (SVD), a method for factorizing a matrix, is one of the most useful tools in linear algebra, and it has many applications. Here I will describe the SVD of an image, but first some background.

(Side note - you can find all of this code on my Github: https://github.com/JTDean123)

SVD is the factorization of a real *m x n* matrix *A* into the product of three matrices:

- an *m x m* matrix *U* containing the left singular vectors of *A*. The left singular vectors of *A* are the set of orthonormal eigenvectors of *AA*^{T}.
- a diagonal *m x n* matrix *S* containing the singular values of *A* (the square roots of the eigenvalues of *A*^{T}*A*). By convention, the values of *S* are arranged in descending order.
- an *n x n* matrix *V* containing the right singular vectors of *A*. The right singular vectors of *A* are the set of orthonormal eigenvectors of *A*^{T}*A*.
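These relationships are easy to sanity-check numerically. The sketch below uses a small random matrix `M` (an arbitrary stand-in, not from the text) to confirm that the squared singular values match the leading eigenvalues of *MM*^{T}:

```
# quick numerical check with an arbitrary 4x3 matrix M (illustration only):
# the squared singular values of M equal the leading eigenvalues of M %*% t(M),
# up to floating-point error
set.seed(1)
M <- matrix(rnorm(12), nrow = 4)
s <- svd(M)
e <- eigen(M %*% t(M))

round(s$d^2, 6)
round(e$values[1:3], 6)   # the fourth eigenvalue of M %*% t(M) is ~0
```

The same check works against `eigen(t(M) %*% M)`, whose eigenvectors correspond to the columns of *V* (up to sign).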

Taken together, this means that *A* can be deconstructed as:

*A* = *USV*^{T}

OK great, but why is this useful? This question will be easier to answer if we go through a simple example. Consider a *4x3* matrix A that we wish to decompose with SVD:

\[\mathbf{A} = \left[\begin{array} {rrr} 4 & 0 & 8 \\ 0 & 6 & 1 \\ 2 & 0 & 6 \\ 0 & 3 & 0 \end{array}\right]\]

The singular value decomposition, *A* = *USV*^{T} (`svd(A)` in R), yields the following result:

\[\mathbf{U} = \left[\begin{array} {rrr} 0.81 & -0.10 & 0.57 \\ 0.12 & 0.89 & -0.10 \\ 0.57 & -0.06 & -0.79 \\ 0.02 & 0.45 & 0.21 \end{array}\right] \mathbf{S} = \left[\begin{array} {rrr} 10.99 & 0 & 0 \\ 0 & 6.69 & 0 \\ 0 & 0 & 0.752 \end{array}\right] \mathbf{V} = \left[\begin{array} {rrr} 0.40 & -0.08 & 0.91 \\ 0.07 & 1.0 & 0.05 \\ 0.91 & -0.04 & -0.40 \end{array}\right]\]
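This decomposition can be reproduced directly in R with the built-in `svd` function (the signs of the columns of *U* and *V* may be flipped relative to the matrices above, which is an equally valid SVD):

```
# construct the example matrix and compute its SVD
A <- matrix(c(4, 0, 8,
              0, 6, 1,
              2, 0, 6,
              0, 3, 0), nrow = 4, byrow = TRUE)

s <- svd(A)
round(s$d, 2)   # singular values: 10.99 6.69 0.75
s$u             # left singular vectors (columns)
s$v             # right singular vectors (columns)
```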

The first thing that we notice is that the first two singular values, 10.99 and 6.69, account for more than 95% of the total sum of the singular values. The magnitude of a singular value is proportional to how much information it contains, meaning we could reconstruct ~95% of *A* with only 2 of the 3 singular values. Furthermore, we see that the first singular value is about 60% of the total sum of the singular values. This means, intuitively, that about two thirds of our data can be explained with one singular value. Looking at the matrix *A* this makes sense, as columns 1 and 3 are similar. We can quantify this by looking at the transpose of *V*:

\[\mathbf{V}^{T} = \left[\begin{array} {rrr} 0.40 & 0.07 & 0.91 \\ -0.08 & 1.0 & -0.04 \\ 0.91 & 0.05 & -0.40 \end{array}\right]\]

The first two rows are those multiplied by the first two singular values of *S*. We see that columns 1 and 3 are very similar in these first two rows, indicating, as we can intuitively see, that columns 1 and 3 are similar. The implications of this are quite numerous. Imagine if the rows of matrix *A* contained customers and the columns represented movie reviews. There could be hundreds or thousands of movies, so how do we make sense of this? Imagine that we calculate the SVD and find that 8 singular values account for 95% of the data. This means that we have 8 different types of movies in the data, possibly corresponding to 'horror', 'action', and so on. We can also apply this type of deconstruction to an image, and this is what I will discuss below.
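To make the "reconstruct ~95% of *A*" idea concrete, here is a sketch of the rank-2 approximation of the example matrix, built by dropping the smallest singular value:

```
A <- matrix(c(4, 0, 8,
              0, 6, 1,
              2, 0, 6,
              0, 3, 0), nrow = 4, byrow = TRUE)
s <- svd(A)

# fraction of the total sum of singular values kept by the first two (~0.96)
sum(s$d[1:2]) / sum(s$d)

# rank-2 reconstruction: keep only the two largest singular values
A2 <- s$u[, 1:2] %*% diag(s$d[1:2]) %*% t(s$v[, 1:2])
round(A2, 1)   # close to A; no entry is off by more than the dropped singular value
```

This truncation is exactly what image compression with SVD does, just with many more singular values to choose from.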

For this analysis we will use an image that I took on a recent hiking adventure to Joshua Tree National Park. Sunny deserts are a good place to go during the Seattle winter! First we load the image into R and plot the original.

```
# readJPEG comes from the jpeg package; imagematrix comes from the
# ReadImages package (archived on CRAN, but still installable from the archive)
library(jpeg)
library(ReadImages)

jTree <- readJPEG('jTree.JPG')
plot(imagematrix(jTree))
```