Let B = ULV† be an M by N matrix in diagonalized form


Question: Let B = ULV† be an M by N matrix in diagonalized form; that is, L is an M by N diagonal matrix with entries λ_1, ..., λ_K on its main diagonal, where K = min(M, N), and U and V are square matrices. Let the n-th column of U be denoted u_n, and similarly for the columns of V. Such a diagonal decomposition occurs in the singular value decomposition (SVD). Show that we can write

B = λ_1 u_1 (v_1)† + ... + λ_K u_K (v_K)†
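
As an informal numerical check (not the requested proof), the sketch below uses NumPy to compute the SVD of a random complex matrix and compares the sum of the rank-one terms λ_k u_k (v_k)† against the original matrix; the matrix sizes and variable names are illustrative assumptions, not part of the question.

# Hedged sketch: numerically check B = sum_k lambda_k u_k (v_k)^dagger
# via NumPy's SVD. The sizes M, N and all names are illustrative.
import numpy as np

M, N = 5, 3
K = min(M, N)

rng = np.random.default_rng(0)
B = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))

# np.linalg.svd returns U (M x M), the singular values, and V^dagger (N x N)
U, lam, Vh = np.linalg.svd(B)

# Rebuild B as a sum of rank-one outer products lambda_k u_k (v_k)^dagger;
# row k of Vh is already the conjugate transpose of v_k.
B_sum = sum(lam[k] * np.outer(U[:, k], Vh[k, :]) for k in range(K))

print(np.allclose(B, B_sum))  # True, up to floating-point error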

If B is an N by N Hermitian matrix, then we can take U = V and K = M = N, with the columns of U the eigenvectors of B, normalized to have Euclidean norm equal to one, and the λ_n to be the eigenvalues of B. In this case we may also assume that U is a unitary matrix; that is, U†U = UU† = I, where I denotes the identity matrix.
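
The Hermitian case can likewise be illustrated (again only as a hedged sketch, not the proof being asked for) with NumPy's eigendecomposition: the columns of U come out orthonormal, U is unitary, and B equals the sum of the terms λ_n u_n (u_n)†. The dimension and names below are assumed for the example.

# Hedged sketch of the Hermitian case: B = sum_n lambda_n u_n (u_n)^dagger
# with U unitary. The size N and all names are illustrative.
import numpy as np

N = 4
rng = np.random.default_rng(1)
A = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
B = (A + A.conj().T) / 2          # make B Hermitian

lam, U = np.linalg.eigh(B)        # real eigenvalues, orthonormal eigenvectors

# U is unitary: U^dagger U = U U^dagger = I
print(np.allclose(U.conj().T @ U, np.eye(N)))

# B as a sum of rank-one terms lambda_n u_n (u_n)^dagger
B_sum = sum(lam[n] * np.outer(U[:, n], U[:, n].conj()) for n in range(N))
print(np.allclose(B, B_sum))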
