# Generalized singular value decomposition

In linear algebra, the generalized singular value decomposition (GSVD) is the name of two different techniques based on the singular value decomposition (SVD). The two versions differ because one decomposes two (or more) matrices (much like higher-order PCA), while the other imposes a set of constraints on the left and right singular vectors.

## Higher order version

The generalized singular value decomposition (GSVD) is a matrix decomposition more general than the singular value decomposition. It was introduced by Van Loan in 1976 and later developed by Paige and Saunders. The SVD and the GSVD, as well as some other possible generalizations of the SVD, are extensively used in the study of the conditioning and regularization of linear systems with respect to quadratic semi-norms.

Let $\mathbb{F}=\mathbb{R}$ or $\mathbb{F}=\mathbb{C}$. Given matrices $A\in\mathbb{F}^{m\times n}$ and $B\in\mathbb{F}^{p\times n}$, their GSVD is given by

$A=U\Sigma_{1}[X,0]Q^{*}$ and

$B=V\Sigma_{2}[X,0]Q^{*},$ where $U\in\mathbb{F}^{m\times m}$, $V\in\mathbb{F}^{p\times p}$, and $Q\in\mathbb{F}^{n\times n}$ are unitary matrices, and $X\in\mathbb{F}^{r\times r}$ is non-singular, with $r=\operatorname{rank}([A^{*},B^{*}])$. Also, $\Sigma_{1}\in\mathbb{F}^{m\times r}$ is non-negative diagonal, and $\Sigma_{2}\in\mathbb{F}^{p\times r}$ is non-negative block-diagonal, with diagonal blocks; $\Sigma_{2}$ is not always diagonal.

It holds that $\Sigma_{1}^{T}\Sigma_{1}=\lceil\alpha_{1}^{2},\dots,\alpha_{r}^{2}\rfloor$ and $\Sigma_{2}^{T}\Sigma_{2}=\lceil\beta_{1}^{2},\dots,\beta_{r}^{2}\rfloor$, and that $\Sigma_{1}^{T}\Sigma_{1}+\Sigma_{2}^{T}\Sigma_{2}=I_{r}$, which implies $0\leq\alpha_{i},\beta_{i}\leq 1$. The ratios $\sigma_{i}=\alpha_{i}/\beta_{i}$ (defined when $\beta_{i}\neq 0$) are called the generalized singular values of $A$ and $B$.

If $B$ is square and invertible, then the generalized singular values are the singular values, and $U$ and $V$ are the matrices of singular vectors, of the matrix $AB^{-1}$. Further, if $B=I$, then the GSVD reduces to the singular value decomposition, explaining the name.
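The relation to $AB^{-1}$ can be checked numerically. The sketch below is a minimal illustration, not a full GSVD routine: it assumes SciPy and uses the fact that, for invertible $B$, the squared generalized singular values $\sigma_i^{2}$ are the eigenvalues of the symmetric-definite pencil $(A^{*}A,\,B^{*}B)$, which it compares against the ordinary singular values of $AB^{-1}$.

```python
import numpy as np
from scipy.linalg import eigh, svdvals

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
B = rng.standard_normal((3, 3))  # square and (almost surely) invertible

# sigma_i satisfies A^T A x = sigma_i^2 B^T B x, so the sigma_i^2 are the
# eigenvalues of the generalized symmetric-definite pencil (A^T A, B^T B).
sigma_pencil = np.sqrt(eigh(A.T @ A, B.T @ B, eigvals_only=True))

# With B invertible, these coincide with the singular values of A B^{-1}.
sigma_direct = svdvals(A @ np.linalg.inv(B))

assert np.allclose(np.sort(sigma_pencil), np.sort(sigma_direct))
```

Production code would instead call a dedicated GSVD routine (e.g. LAPACK's), since forming $A^{*}A$ and $B^{-1}$ explicitly squares the conditioning of the problem.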

## Weighted version

The weighted version of the generalized singular value decomposition (GSVD) is a constrained matrix decomposition in which constraints are imposed on the left and right singular vectors of the singular value decomposition. This form of the GSVD is a direct extension of the SVD. Given the SVD of an m×n real or complex matrix $M$,

$M=U\Sigma V^{*},$ where

$U^{*}W_{u}U=V^{*}W_{v}V=I,$ where $I$ is the identity matrix and $U$ and $V$ are orthonormal with respect to the constraint matrices $W_{u}$ and $W_{v}$. Additionally, $W_{u}$ and $W_{v}$ are positive-definite matrices (often diagonal matrices of weights). This form of the GSVD is the core of certain techniques, such as generalized principal component analysis and correspondence analysis.
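One standard way to compute this weighted decomposition (a sketch under the common assumption of diagonal weights, not the only possible algorithm) is to absorb the weights into the matrix, take an ordinary SVD, and then rescale the singular vectors: with $\tilde{M}=W_{u}^{1/2}MW_{v}^{1/2}=\tilde{U}\Sigma\tilde{V}^{*}$, setting $U=W_{u}^{-1/2}\tilde{U}$ and $V=W_{v}^{-1/2}\tilde{V}$ satisfies both constraints.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 3))

# Illustrative positive diagonal weights (an assumed choice for the example).
wu = rng.uniform(0.5, 2.0, size=4)  # diagonal of W_u
wv = rng.uniform(0.5, 2.0, size=3)  # diagonal of W_v

# Absorb the weights, take the ordinary SVD, then undo the scaling.
Mt = np.sqrt(wu)[:, None] * M * np.sqrt(wv)[None, :]
Ut, s, Vt_h = np.linalg.svd(Mt, full_matrices=False)
U = Ut / np.sqrt(wu)[:, None]             # U = W_u^{-1/2} @ Ut
V = Vt_h.conj().T / np.sqrt(wv)[:, None]  # V = W_v^{-1/2} @ Vt

# Verify M = U diag(s) V* with U* W_u U = V* W_v V = I.
assert np.allclose(U @ np.diag(s) @ V.conj().T, M)
assert np.allclose(U.conj().T @ (wu[:, None] * U), np.eye(3))
assert np.allclose(V.conj().T @ (wv[:, None] * V), np.eye(3))
```

For general (non-diagonal) positive-definite $W_{u}$ and $W_{v}$, the same construction works with matrix square roots or Cholesky factors in place of the elementwise square roots.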

The weighted form of the GSVD is so called because, with an appropriate choice of weights, it generalizes many techniques, such as multidimensional scaling and linear discriminant analysis.

## Applications

The GSVD, formulated as a comparative spectral decomposition, has been successfully applied to signal processing and data science, e.g., in genomic signal processing.

These applications inspired several additional comparative spectral decompositions, namely the higher-order GSVD (HO GSVD) and the tensor GSVD.