In this paper, inspired by our previous algorithm, which was
based on the theory of Tsallis statistical mechanics, we develop a
new evolving stochastic learning algorithm for neural networks.
The new algorithm combines deterministic and stochastic search
steps by employing a different adaptive stepsize for each network
weight, and applies a form of noise that is characterized by the
nonextensive entropic index q and regulated by a weight decay term.
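The abstract does not specify the exact update rule, so the following is only a rough, hypothetical sketch of such a hybrid step: a deterministic per-weight move plus q-Gaussian (Tsallis-type) noise scaled by the temperature T and damped by a weight decay term. The generator exploits the standard fact that a q-Gaussian with 1 < q < 3 is, up to scale, a Student-t variate with nu = (3 - q)/(q - 1) degrees of freedom; all names and the precise form of the update are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def q_gaussian_noise(shape, q, T, rng):
    """Draw Tsallis (q-Gaussian) noise for 1 < q < 3 via its Student-t
    representation; the temperature T sets the noise magnitude.
    Illustrative choice, not necessarily the generator used in the paper."""
    nu = (3.0 - q) / (q - 1.0)
    return np.sqrt(T) * rng.standard_t(nu, size=shape)

def hybrid_weight_update(w, step, grad, q, T, decay, rng):
    """Hypothetical hybrid step: deterministic move with a per-weight
    adaptive stepsize, plus q-indexed noise regulated by weight decay."""
    deterministic = -step * grad                      # step may be an array (one stepsize per weight)
    stochastic = q_gaussian_noise(w.shape, q, T, rng) - decay * w
    return w + deterministic + stochastic

# Usage sketch
rng = np.random.default_rng(0)
w = rng.normal(size=10)
grad = rng.normal(size=10)
w = hybrid_weight_update(w, step=0.01, grad=grad, q=1.5, T=0.1, decay=1e-4, rng=rng)
```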
The behavior of the learning algorithm can be made more stochastic
or deterministic depending on the trade-off between the
temperature T and the entropic index q. This is achieved by
introducing a formula that defines a time-dependent relationship
between these two important learning parameters.
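The abstract only states that a time-dependent formula couples T and q; a minimal sketch of one such coupling, assuming both relax together so the search moves from stochastic (large T, q > 1) toward deterministic (T near 0, q near 1), might look as follows. The specific decay law and constants are assumptions, not the formula proposed in the paper.

```python
def temperature_q_schedule(t, T0=1.0, q0=1.5, tau=100.0):
    """Hypothetical schedule linking temperature and entropic index over time t.

    T cools hyperbolically, and q relaxes toward 1 in proportion to T,
    so heavy-tailed exploration early on gives way to nearly
    deterministic, Gaussian-like steps later in training."""
    T_t = T0 / (1.0 + t / tau)            # temperature cooling
    q_t = 1.0 + (q0 - 1.0) * (T_t / T0)   # q -> 1 as T -> 0
    return T_t, q_t
```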
Our experimental study verifies that the new evolving stochastic
learning algorithm does indeed converge faster than the original
Hybrid Learning Scheme (HLS). In addition, experiments are
conducted to explore the influence of the entropic index q and
the temperature T on the convergence speed and stability of the
proposed method.