Abstract

Neural networks play an important role in approximating nonlinear functions, in particular Lebesgue integrable functions, which can be approximated by feedforward neural networks (FNNs) with one hidden layer and sigmoidal activation functions. Various neural network operators have been defined that achieve good rates of approximation in terms of the modulus of smoothness. Here we define a new neural network operator with a generalized sigmoidal function (SoftMax) to improve the rate of approximation of Lebesgue integrable functions in Lp, with p < 1, estimated using the modulus of smoothness of order k. SoftMax is chosen as the activation function because of its flexible properties and wide range of applications.
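For readers unfamiliar with the activation mentioned above, a minimal numerical sketch follows, assuming the standard definition of SoftMax, σ(x)ᵢ = exp(xᵢ) / Σⱼ exp(xⱼ); the paper's generalized operator built on it is not reproduced here.

```python
import math

def softmax(xs):
    """Standard SoftMax: maps a real vector to a probability distribution."""
    # Subtract the maximum before exponentiating for numerical stability;
    # this does not change the result.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# The output components are positive and sum to 1.
probs = softmax([1.0, 2.0, 3.0])
print(probs)
```

Because SoftMax is smooth and its outputs form a convex combination, it behaves like a differentiable generalization of a sigmoidal gate, which is the flexibility the abstract alludes to.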

Details

Title
SoftMax Neural Best Approximation
Author
Almurieb, Hawraa A.; Bhaya, Eman S.

Department of Mathematics, College of Education for Pure Sciences, University of Babylon, Iraq.
Publication year
2020
Publication date
Jun 2020
Publisher
IOP Publishing
ISSN
1757-8981
e-ISSN
1757-899X
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2562537451
Copyright
© 2020. This work is published under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.