

15.5.606.1 svd Applications

Applications of the singular value decomposition:

A nonnegative real number s is a singular value of matrix m if there exist unit-length vectors u and v such that m # v = s * u and transpose(m) # u = s * v. Then u and v are called the left-singular and right-singular vectors corresponding to s. After the call svd, m, u2, s2, v2, the elements of s2 are the singular values of m.
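
As an illustration, here is a sketch of these defining relations in NumPy rather than the manual's own notation (numpy.linalg.svd returns u, s, vh, where vh corresponds to transpose(v2) above; all names are illustrative):

     import numpy as np

     rng = np.random.default_rng(0)
     m = rng.standard_normal((4, 3))

     # numpy.linalg.svd returns m = u @ diag(s) @ vh,
     # so the rows of vh are the right singular vectors.
     u, s, vh = np.linalg.svd(m, full_matrices=False)

     # Check m # v = s * u and transpose(m) # u = s * v
     # for every singular triple (s, u, v).
     for i, si in enumerate(s):
         assert np.allclose(m @ vh[i], si * u[:, i])
         assert np.allclose(m.T @ u[:, i], si * vh[i])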

The pseudo-inverse of matrix m with singular value decomposition m = u2 # mdiagonal(s2) # transpose(v2) is equal to Mpi = v2 # mdiagonal(Spi) # transpose(u2), where vector Spi is obtained from vector s2 by replacing each non-zero element with its reciprocal. The solution of the linear least-squares problem m # b = y is then bs = Mpi # y, although going through the pseudo-inverse is not the most efficient way to solve a least-squares problem.
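
A NumPy sketch of the pseudo-inverse route to a least-squares solution (the tolerance deciding which singular values count as zero is an illustrative choice):

     import numpy as np

     rng = np.random.default_rng(0)
     m = rng.standard_normal((5, 3))
     y = rng.standard_normal(5)

     u, s, vh = np.linalg.svd(m, full_matrices=False)

     # Replace each non-zero singular value by its reciprocal;
     # treat values below a round-off tolerance as zero.
     tol = 1e-12 * s.max()
     s_pi = np.array([1.0 / si if si > tol else 0.0 for si in s])

     m_pi = vh.T @ np.diag(s_pi) @ u.T   # pseudo-inverse of m
     bs = m_pi @ y                       # least-squares solution of m # b = y

     # Agrees with a dedicated least-squares solver:
     assert np.allclose(bs, np.linalg.lstsq(m, y, rcond=None)[0])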

The unit vector x that minimizes the norm of m # x is the right singular vector of m corresponding to the smallest singular value.
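
A quick NumPy check of this claim (illustrative; the rows of vh are the right singular vectors, in order of decreasing singular value):

     import numpy as np

     rng = np.random.default_rng(0)
     m = rng.standard_normal((4, 3))
     u, s, vh = np.linalg.svd(m, full_matrices=False)

     x = vh[-1]   # right singular vector for the smallest singular value
     assert np.isclose(np.linalg.norm(m @ x), s[-1])

     # No random unit vector does better:
     for _ in range(100):
         r = rng.standard_normal(3)
         r /= np.linalg.norm(r)
         assert np.linalg.norm(m @ r) >= s[-1] - 1e-12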

The null space of matrix m is spanned by the right singular vectors corresponding to vanishing singular values of m. The range of matrix m is spanned by the left singular vectors corresponding to non-zero singular values of m. The rank of matrix m equals the number of non-zero singular values.
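
A NumPy sketch of reading rank, null space, and range off the singular values (the rank tolerance is an illustrative choice):

     import numpy as np

     # A 4x3 matrix of rank 2: the third column is the sum of the first two.
     m = np.array([[1., 0., 1.],
                   [0., 1., 1.],
                   [1., 1., 2.],
                   [2., 0., 2.]])
     u, s, vh = np.linalg.svd(m)

     tol = 1e-12 * s.max()
     rank = int(np.sum(s > tol))      # number of non-zero singular values: 2
     null_space = vh[rank:].T         # right singular vectors for s == 0
     col_range = u[:, :rank]          # left singular vectors for s != 0

     assert rank == 2
     assert np.allclose(m @ null_space, 0.0)   # null-space vectors map to 0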

The matrix Mr of rank r that best approximates matrix m (by minimizing the Frobenius norm of the difference) is Mr = u2 # mdiagonal(Sr) # transpose(v2), where Sr equals s2 with all singular values beyond the first r set to zero.
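
A NumPy sketch of the best rank-r approximation; its Frobenius distance to m equals the norm of the discarded singular values:

     import numpy as np

     rng = np.random.default_rng(0)
     m = rng.standard_normal((6, 5))
     u, s, vh = np.linalg.svd(m, full_matrices=False)

     r = 2
     s_r = s.copy()
     s_r[r:] = 0.0                    # zero all but the first r singular values
     m_r = u @ np.diag(s_r) @ vh      # best rank-r approximation of m

     assert np.isclose(np.linalg.norm(m - m_r), np.linalg.norm(s[r:]))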

The orthonormal matrix r closest to m (under the Frobenius norm) is u2 # transpose(v2), i.e., m with all singular values replaced by 1.
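
A NumPy sketch of the nearest orthonormal matrix (illustrative names):

     import numpy as np

     rng = np.random.default_rng(0)
     m = rng.standard_normal((3, 3))
     u, s, vh = np.linalg.svd(m)

     r = u @ vh                              # all singular values replaced by 1
     assert np.allclose(r @ r.T, np.eye(3))  # r is orthonormal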

The orthonormal matrix r that most closely maps a to b (the orthogonal Procrustes problem, minimizing the Frobenius norm of a # r - b) is the nearest orthonormal matrix to transpose(a) # b.
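
A NumPy sketch of the orthogonal Procrustes solution, recovering a known orthonormal matrix (illustrative setup):

     import numpy as np

     rng = np.random.default_rng(0)
     a = rng.standard_normal((10, 3))
     r_true = np.linalg.qr(rng.standard_normal((3, 3)))[0]  # random orthonormal
     b = a @ r_true

     # The nearest orthonormal matrix to transpose(a) # b
     # minimizes the Frobenius norm of a # r - b.
     u, s, vh = np.linalg.svd(a.T @ b)
     r = u @ vh
     assert np.allclose(r, r_true)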

