Volume 14, Issue 4
Nonlinear Level Set Learning for Function Approximation on Sparse Data with Applications to Parametric Differential Equations

Anthony Gruber, Max Gunzburger, Lili Ju, Yuankai Teng & Zhu Wang

Numer. Math. Theor. Meth. Appl., 14 (2021), pp. 839-861.

Published online: 2021-09

  • Abstract

A dimension reduction method based on the “Nonlinear Level set Learning” (NLL) approach is presented for the pointwise prediction of functions which have been sparsely sampled. Leveraging geometric information provided by the Implicit Function Theorem, the proposed algorithm effectively reduces the input dimension to the theoretical lower bound with minor accuracy loss, providing a one-dimensional representation of the function which can be used for regression and sensitivity analysis. Experiments and applications are presented which compare this modified NLL with the original NLL and the Active Subspaces (AS) method. While accommodating sparse input data, the proposed algorithm is shown to train quickly and provide a much more accurate and informative reduction than either AS or the original NLL on two example functions with high-dimensional domains, as well as two state-dependent quantities depending on the solutions to parametric differential equations.
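The abstract compares the modified NLL against the Active Subspaces (AS) method, which reduces input dimension linearly by finding the directions along which the function varies most, on average. A minimal illustrative sketch of that AS baseline (not the authors' NLL algorithm) follows; the test function `f(x) = exp(w·x)` and all variable names are assumptions chosen so that the active subspace is exactly `span{w}`:

```python
import numpy as np

# Sketch of the Active Subspaces (AS) baseline: estimate the gradient
# covariance C = E[grad f grad f^T] from samples, take its dominant
# eigenvector u, and reduce the input to the single coordinate z = u . x.
# Test function (an assumption for illustration): f(x) = exp(w . x),
# whose gradient grad f(x) = w * exp(w . x) is always parallel to w.

rng = np.random.default_rng(0)
dim, n_samples = 10, 200
w = rng.normal(size=dim)
w /= np.linalg.norm(w)

X = rng.uniform(-1.0, 1.0, size=(n_samples, dim))
f = np.exp(X @ w)
grads = f[:, None] * w[None, :]          # grad f(x) = w * exp(w . x)

# Monte Carlo estimate of the gradient covariance matrix.
C = grads.T @ grads / n_samples
eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
u = eigvecs[:, -1]                       # dominant eigenvector

# One-dimensional reduced coordinate: regression of f on z is now 1-D.
z = X @ u

# For this f the gradients are exactly parallel to w, so u aligns with
# w up to sign.
print(abs(u @ w))  # close to 1.0
```

Because AS is restricted to linear projections, it can only match the one-dimensional reduction above when the level sets of `f` are parallel hyperplanes; the NLL approach of the paper learns a nonlinear transformation instead, which is what allows it to reach the theoretical lower bound on the reduced dimension for more general functions.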

  • AMS Subject Headings

65D15, 65D40

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX
@Article{NMTMA-14-839,
  author  = {Gruber, Anthony and Gunzburger, Max and Ju, Lili and Teng, Yuankai and Wang, Zhu},
  title   = {Nonlinear Level Set Learning for Function Approximation on Sparse Data with Applications to Parametric Differential Equations},
  journal = {Numerical Mathematics: Theory, Methods and Applications},
  year    = {2021},
  volume  = {14},
  number  = {4},
  pages   = {839--861},
  issn    = {2079-7338},
  doi     = {10.4208/nmtma.OA-2021-0062},
  url     = {http://global-sci.org/intro/article_detail/nmtma/19521.html}
}
  • RIS

TY  - JOUR
T1  - Nonlinear Level Set Learning for Function Approximation on Sparse Data with Applications to Parametric Differential Equations
AU  - Gruber, Anthony
AU  - Gunzburger, Max
AU  - Ju, Lili
AU  - Teng, Yuankai
AU  - Wang, Zhu
JO  - Numerical Mathematics: Theory, Methods and Applications
VL  - 14
IS  - 4
SP  - 839
EP  - 861
PY  - 2021
DA  - 2021/09
SN  - 2079-7338
DO  - 10.4208/nmtma.OA-2021-0062
UR  - https://global-sci.org/intro/article_detail/nmtma/19521.html
KW  - Nonlinear level set learning
KW  - function approximation
KW  - sparse data
KW  - nonlinear dimensionality reduction
ER  -

  • TXT

Anthony Gruber, Max Gunzburger, Lili Ju, Yuankai Teng & Zhu Wang. (2021). Nonlinear Level Set Learning for Function Approximation on Sparse Data with Applications to Parametric Differential Equations. Numerical Mathematics: Theory, Methods and Applications. 14 (4). 839-861. doi:10.4208/nmtma.OA-2021-0062