Commun. Comput. Phys., 35 (2024), pp. 761-815.
doi: 10.4208/cicp.OA-2023-0253
Published online: 2024-04
[An open-access article; the PDF is free to any online user.]
The Deep Ritz method is a deep learning paradigm for solving partial differential equations. In this article we study the generalization error of the Deep Ritz method. We focus on the quintessential problem, the Poisson equation. We show that the generalization error of the Deep Ritz method converges to zero at the rate $\frac{C}{\sqrt{n}}$, and we discuss the constant $C$. Results are obtained for shallow and residual neural networks with smooth activation functions.
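As a concrete illustration of the setting analyzed in the paper, the following is a minimal sketch of a Deep Ritz training loop for the Poisson problem on the unit square with zero Dirichlet data, using a shallow network with a smooth (tanh) activation. The network width, penalty weight, right-hand side $f \equiv 1$, and sample counts are illustrative assumptions, not choices taken from the paper.

```python
# Minimal Deep Ritz sketch for -Δu = f on the unit square with u = 0 on the
# boundary, enforced by a soft penalty. All hyperparameters are illustrative.
import torch

torch.manual_seed(0)

# Shallow network with a smooth (tanh) activation.
net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def f(x):                      # assumed right-hand side, here f ≡ 1
    return torch.ones(x.shape[0], 1)

n_interior, n_boundary, beta = 1024, 256, 500.0   # n: Monte Carlo sample size

for step in range(2000):
    # Interior samples for the Ritz energy  E[u] = ∫ (½|∇u|² − f u) dx
    x = torch.rand(n_interior, 2, requires_grad=True)
    u = net(x)
    grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    energy = (0.5 * (grad_u ** 2).sum(dim=1, keepdim=True) - f(x) * u).mean()

    # Boundary samples for the penalty term enforcing u = 0 on ∂Ω
    t = torch.rand(n_boundary, 1)
    sides = torch.cat([
        torch.cat([t, torch.zeros_like(t)], dim=1),
        torch.cat([t, torch.ones_like(t)], dim=1),
        torch.cat([torch.zeros_like(t), t], dim=1),
        torch.cat([torch.ones_like(t), t], dim=1),
    ])
    boundary_loss = net(sides).pow(2).mean()

    # Empirical risk: Monte Carlo estimate of the energy plus boundary penalty.
    loss = energy + beta * boundary_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The generalization error studied in the paper concerns the gap between this Monte Carlo (empirical) energy and the exact integral it approximates, which is where the sample size $n$ and the rate $C/\sqrt{n}$ enter.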