Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/135769
Full metadata record
DC Field: Value
dc.contributor.author: Duan, L.
dc.contributor.author: Duan, F.
dc.contributor.author: Chapeau-Blondeau, F.
dc.contributor.author: Abbott, D.
dc.date.issued: 2021
dc.identifier.citation: IEEE Transactions on Instrumentation and Measurement, 2021; 70:1010612-1-1010612-12
dc.identifier.issn: 0018-9456
dc.identifier.issn: 1557-9662
dc.identifier.uri: https://hdl.handle.net/2440/135769
dc.description.abstract: Aiming to ensure the feasibility of backpropagation training of feedforward threshold neural networks, each hidden layer is designed to be composed of a sufficiently large number of hard-limiting activation functions that are excited simultaneously by mutually independent external noise components and the weighted inputs. The application of noise to nondifferentiable activation functions enables a proper definition of the gradients, and the injected noise is treated as a network parameter that can be adaptively updated by a stochastic gradient descent learning rule. This noise-boosted backpropagation learning process is found to converge to a nonzero optimized noise level, indicating that the injected noise is beneficial both for learning and for the ensuing retrieval phase. For minimizing the total error energy of function approximation in the designed threshold neural network, the proposed noise-boosted backpropagation learning method is proven to be better than directly injecting noise into the network inputs or weight coefficients. The Lipschitz continuity of the noise-smoothed activation function in the hidden layer is demonstrated to guarantee local convergence of the learning process. Beyond Gaussian injected noise, the optimal noise type is also solved numerically for training the designed threshold neural network. Test experiments on approximating nonlinear functions and real-world datasets verify the feasibility of this noise-boosted backpropagation algorithm in the threshold neural network. These results not only extend the analysis of the beneficial effects of noise, akin to stochastic resonance and exploited here, to the universal approximation capabilities of threshold neural networks, but also allow backpropagation training of neural networks with a much wider family of nondifferentiable activation functions.
dc.description.statementofresponsibility: Lingling Duan, Fabing Duan, François Chapeau-Blondeau, and Derek Abbott, Fellow, IEEE
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.rights: © 2021 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information.
dc.source.uri: http://dx.doi.org/10.1109/tim.2021.3121502
dc.subject: Function approximation; noise injection; noise-boosted backpropagation; optimal noise; stochastic resonance; threshold neural network
dc.title: Noise-Boosted Backpropagation Learning of Feedforward Threshold Neural Networks for Function Approximation
dc.type: Journal article
dc.identifier.doi: 10.1109/TIM.2021.3121502
dc.relation.grant: http://purl.org/au-research/grants/arc/DP200103795
pubs.publication-status: Published
dc.identifier.orcid: Abbott, D. [0000-0002-0945-2674]
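
The abstract above describes how injecting noise into hard-limiting activations yields a smoothed, differentiable response whose gradient can drive backpropagation, with the noise level itself updated by stochastic gradient descent. The following Python sketch is an illustration of that idea under stated assumptions (Gaussian injected noise, a Heaviside hard-limiter, NumPy/SciPy); the function names and parameters are hypothetical and are not taken from the paper.

import numpy as np
from scipy.stats import norm  # assumption: Gaussian injected noise, as discussed in the abstract

# Illustrative sketch only (not the authors' code): one "hidden unit" built from
# n_copies hard-limiting (Heaviside) activations, each driven by the weighted
# input plus an independent noise sample. Averaging over the ensemble gives a
# smooth response whose expectation is the Gaussian CDF Phi(x / sigma), so
# gradients with respect to both the input and the noise level are well defined
# and sigma can be treated as a trainable parameter.

def noisy_threshold_ensemble(x, sigma, n_copies=1000, rng=None):
    """Forward pass: empirical average of hard-limiters over independent noise samples."""
    rng = np.random.default_rng(0) if rng is None else rng
    noise = rng.normal(0.0, sigma, size=(n_copies,) + np.shape(x))
    return np.heaviside(x + noise, 0.5).mean(axis=0)

def smoothed_activation(x, sigma):
    """Expected ensemble output: the noise-smoothed activation Phi(x / sigma)."""
    return norm.cdf(x / sigma)

def grad_wrt_input(x, sigma):
    """d/dx of the smoothed activation (the noise pdf), usable in backpropagation."""
    return norm.pdf(x / sigma) / sigma

def grad_wrt_sigma(x, sigma):
    """d/dsigma of the smoothed activation, letting gradient descent adapt the noise level."""
    return -x * norm.pdf(x / sigma) / sigma**2

if __name__ == "__main__":
    x, sigma = 0.3, 0.5
    print(noisy_threshold_ensemble(x, sigma))   # close to smoothed_activation(x, sigma)
    print(smoothed_activation(x, sigma),
          grad_wrt_input(x, sigma),
          grad_wrt_sigma(x, sigma))

The ensemble average converges to the smooth Gaussian CDF as the number of hard-limiters grows, which is the mechanism that makes gradients well defined despite each individual activation being nondifferentiable.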
Appears in Collections: Electrical and Electronic Engineering publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.