Volume 38, Issue 4
Solving Systems of Quadratic Equations Via Exponential-Type Gradient Descent Algorithm

Meng Huang & Zhiqiang Xu

J. Comp. Math., 38 (2020), pp. 638-660.

Published online: 2020-04

  • Abstract

We consider the rank minimization problem with quadratic measurements, i.e., recovering a rank-$r$ matrix $X \in \mathbb{R}^{n\times r}$ from $m$ scalar measurements $y_i=a_i^T XX^T a_i,\;a_i\in \mathbb{R}^n,\;i=1,\ldots,m$. Such problems arise in a variety of applications, such as quadratic regression and quantum state tomography. We present a novel algorithm, termed the {\em exponential-type gradient descent algorithm}, which minimizes the non-convex objective function $f(U)=\frac{1}{4m}\sum_{i=1}^m(y_i-a_i^T UU^T a_i)^2$. The algorithm starts with a careful initialization and then refines this initial guess by iteratively applying exponential-type gradient descent steps. In particular, we can obtain a good initial guess of $X$ as long as the number of Gaussian random measurements is $O(nr)$, and our iterative algorithm converges linearly to the true $X$ (up to an orthogonal matrix) with $m=O\left(nr\log (cr)\right)$ Gaussian random measurements.
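The overall scheme described in the abstract — a spectral-style initial guess followed by gradient descent on $f(U)$ — can be sketched as follows. This is a simplified illustration, not the authors' method: it uses a plain constant step size rather than the paper's exponential-type rule, and the spectral initialization (top-$r$ eigenvectors of $\frac{1}{m}\sum_i y_i a_i a_i^T$, rescaled via the Gaussian identity $\mathbb{E}[\frac{1}{m}\sum_i y_i a_i a_i^T] = 2XX^T + \mathrm{tr}(XX^T)I$) and all parameter choices are illustrative assumptions.

```python
import numpy as np


def loss(U, A, y):
    """f(U) = (1/4m) * sum_i (y_i - a_i^T U U^T a_i)^2."""
    g = np.sum((A @ U) ** 2, axis=1)          # g_i = ||U^T a_i||^2 = a_i^T U U^T a_i
    return np.sum((y - g) ** 2) / (4 * len(y))


def grad(U, A, y):
    """grad f(U) = (1/m) * sum_i (a_i^T U U^T a_i - y_i) a_i a_i^T U."""
    B = A @ U                                  # row i is U^T a_i
    g = np.sum(B * B, axis=1)
    return (A.T * (g - y)) @ B / len(y)


def recover(A, y, r, iters=3000, mu=0.1):
    """Recover U with U U^T ~ X X^T from quadratic Gaussian measurements."""
    m = len(y)
    M = (A.T * y) @ A / m                      # (1/m) sum_i y_i a_i a_i^T
    vals, vecs = np.linalg.eigh(M)             # ascending eigenvalues
    # For Gaussian a_i, E[M] = 2 XX^T + tr(XX^T) I and E[y_i] = tr(XX^T),
    # so (lambda_j(M) - mean(y)) / 2 estimates the eigenvalues of XX^T.
    scale = np.sqrt(np.maximum((vals[-r:] - y.mean()) / 2.0, 0.0))
    U = vecs[:, -r:] * scale                   # spectral initial guess
    step = mu / vals[-1]                       # conservative constant step size
    for _ in range(iters):
        U = U - step * grad(U, A, y)
    return U
```

Note that $X$ is identifiable only up to right-multiplication by an orthogonal matrix, so success is measured by how close $UU^T$ is to $XX^T$ rather than by $U - X$ itself.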

  • Keywords

Low-rank matrix recovery, Non-convex optimization, Phase retrieval.

  • AMS Subject Headings

90C26, 94A15

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address

hm@lsec.cc.ac.cn (Meng Huang)

xuzq@lsec.cc.ac.cn (Zhiqiang Xu)

  • BibTex

@Article{JCM-38-638,
  author  = {Huang, Meng and Xu, Zhiqiang},
  title   = {Solving Systems of Quadratic Equations Via Exponential-Type Gradient Descent Algorithm},
  journal = {Journal of Computational Mathematics},
  year    = {2020},
  volume  = {38},
  number  = {4},
  pages   = {638--660},
  issn    = {1991-7139},
  doi     = {10.4208/jcm.1902-m2018-0109},
  url     = {http://global-sci.org/intro/article_detail/jcm/16467.html}
}

  • RIS

TY - JOUR
T1 - Solving Systems of Quadratic Equations Via Exponential-Type Gradient Descent Algorithm
AU - Huang, Meng
AU - Xu, Zhiqiang
JO - Journal of Computational Mathematics
VL - 38
IS - 4
SP - 638
EP - 660
PY - 2020
DA - 2020/04
SN - 1991-7139
DO - 10.4208/jcm.1902-m2018-0109
UR - https://global-sci.org/intro/article_detail/jcm/16467.html
KW - Low-rank matrix recovery
KW - Non-convex optimization
KW - Phase retrieval
ER -

  • TXT

Meng Huang & Zhiqiang Xu. (2020). Solving Systems of Quadratic Equations Via Exponential-Type Gradient Descent Algorithm. Journal of Computational Mathematics. 38 (4). 638-660. doi:10.4208/jcm.1902-m2018-0109