Volume 34, Issue 3
A Rate of Convergence of Weak Adversarial Neural Networks for the Second Order Parabolic PDEs

Yuling Jiao, Jerry Zhijian Yang, Cheng Yuan & Junyu Zhou

Commun. Comput. Phys., 34 (2023), pp. 813-836.

Published online: 2023-10

  • Abstract

In this paper, we give the first rigorous error estimate for Weak Adversarial Neural Networks (WAN) in solving second order parabolic PDEs. Decomposing the total error into an approximation error and a statistical error, we first show that the weak solution can be approximated by $ReLU^2$ networks to arbitrary accuracy, and then prove that the statistical error can be bounded by the Rademacher complexity of the network function class, which in turn is bounded by an integral involving the covering numbers and the pseudo-dimension of the $ReLU^2$ space. Combining the two bounds, we prove that the error of the WAN method is well controlled provided the depth and width of the neural network, as well as the numbers of samples, are properly selected. Our result also reveals a certain freedom in choosing the numbers of samples on $\partial\Omega$ and along the time axis.
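
The structure of the result can be sketched as follows; the notation here ($u^*$ for the weak solution, $u_\theta$ for the WAN solution network, $\varphi_\eta$ for the adversarial test-function network, $\mathcal{A}$ for the parabolic operator in weak form, $L$ and $\widehat{L}$ for the population and sampled losses, and $\hat\theta$ for the trained parameters) is ours and is not taken from the paper.

% Schematic only; all symbols are assumed notation, not the paper's.
% WAN poses the weak formulation as a min-max problem over the two networks:
\[
  \min_{\theta}\,\max_{\eta}\;
  \frac{\big|\langle \mathcal{A}[u_\theta]-f,\ \varphi_\eta\rangle\big|^{2}}{\|\varphi_\eta\|^{2}},
\]
% and the analysis splits the error of the trained solution against u^* into
% an approximation part and a statistical part:
\[
  \mathcal{E}(u_{\hat\theta})
  \;\lesssim\;
  \underbrace{\inf_{\theta}\,\mathcal{E}(u_\theta)}_{\text{approximation error ($ReLU^2$ networks)}}
  \;+\;
  \underbrace{\sup_{\theta}\,\big|\widehat{L}(u_\theta)-L(u_\theta)\big|}_{\text{statistical error (Rademacher complexity)}}.
\]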

  • AMS Subject Headings

62G05, 65N12, 65N15, 68T07

  • Copyright

© Global Science Press

  • Keywords

Weak Adversarial Networks, second order parabolic PDEs, error analysis.

Yuling Jiao, Jerry Zhijian Yang, Cheng Yuan & Junyu Zhou (2023). A Rate of Convergence of Weak Adversarial Neural Networks for the Second Order Parabolic PDEs. Communications in Computational Physics, 34(3), 813-836. doi:10.4208/cicp.OA-2023-0063