
Entropy, Vol. 21, Pages 627: Smooth Function Approximation by Deep Neural Networks with General Activation Functions (Entropy)

 
 

27 June 2019 01:00:08

 
 


There has been growing interest in the expressivity of deep neural networks. However, most existing work on this topic focuses on specific activation functions such as ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class of activation functions, which includes most of the activation functions used in practice. We derive the depth, width, and sparsity that a deep neural network requires to approximate any Hölder smooth function up to a given approximation error, for this large class of activation functions. Based on our approximation error analysis, we establish the minimax optimality of deep neural network estimators with general activation functions in both regression and classification problems.
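The kind of result the abstract describes can be illustrated numerically: a shallow network whose activation is smooth and non-ReLU (softplus here, one member of the broad activation class the paper considers) can approximate a smooth target to small uniform error. The following is an illustrative sketch only, not the paper's construction; the architecture, the random-features fitting scheme, and the target function are assumptions made for the demo.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): approximate the smooth
# target f(x) = sin(2*pi*x) on [0, 1] with a one-hidden-layer network using
# the softplus activation.
rng = np.random.default_rng(0)

def softplus(z):
    # Smooth, non-ReLU activation.
    return np.log1p(np.exp(z))

n, m = 200, 50                          # sample points, hidden units
x = np.linspace(0.0, 1.0, n)[:, None]
y = np.sin(2 * np.pi * x).ravel()       # Hölder smooth (indeed analytic) target

# Random inner weights and biases (random-features style); only the
# output layer is fitted, by least squares.
W = rng.normal(scale=5.0, size=(1, m))
b = rng.normal(scale=5.0, size=m)
H = softplus(x @ W + b)                 # hidden-layer features

coef, *_ = np.linalg.lstsq(H, y, rcond=None)
sup_err = np.max(np.abs(H @ coef - y))  # uniform (sup-norm) error on the grid
print(f"sup-norm error: {sup_err:.2e}")
```

Increasing the number of hidden units `m` drives the uniform error down, mirroring (informally) the width/error trade-off the paper quantifies.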


 
Category: Informatics, Physics
 
 
 

