In this paper, we introduce and investigate the statistical mechanics of hierarchical
neural networks. First, we approach these systems à la Mattis, by
thinking of the Dyson model as a single-pattern hierarchical neural network.
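For orientation, the Dyson hierarchical model over N = 2^{k+1} Ising spins can be written recursively as (the notation below is assumed for illustration and need not match the paper's conventions):
\[
H_{k+1}(\sigma) = H_k(\sigma^{(1)}) + H_k(\sigma^{(2)}) - \frac{J}{2^{2\rho(k+1)}} \sum_{1 \le i < j \le 2^{k+1}} \sigma_i \sigma_j, \qquad H_0 = 0,
\]
where \sigma^{(1)} and \sigma^{(2)} are the two half-systems, J > 0, and \rho \in (1/2, 1] makes the couplings decay with the hierarchical level. The à la Mattis reading then amounts to dressing each pairwise term with a single stored pattern \xi \in \{-1,+1\}^N, i.e. replacing \sigma_i \sigma_j with \xi_i \xi_j \sigma_i \sigma_j, which is gauge-equivalent to the ferromagnetic case via \sigma_i \to \xi_i \sigma_i.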
We also discuss the stability of the different retrievable states, as predicted by the related self-consistency equations obtained both from a mean-field bound and from a bound that bypasses the mean-field limitation. The latter is worked out by properly reabsorbing the magnetization fluctuations stemming from the higher levels of the hierarchy into effective fields for the lower levels.
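As a rough sketch of the two bounds (again with assumed, illustrative notation): the mean-field route yields a Curie–Weiss-like self-consistency in which a spin feels the cumulative coupling of all the levels it belongs to,
\[
m = \tanh\left(\beta J_{\mathrm{eff}}\, m\right), \qquad J_{\mathrm{eff}} = J \sum_{l=1}^{k+1} 2^{\,l-1}\, 2^{-2\rho l},
\]
which converges as k \to \infty precisely for \rho > 1/2. The improved bound instead keeps one magnetization per level, each obeying a tanh equation whose effective field collects the contributions of the levels above it, rather than a single global m.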
Remarkably, by mixing Amit's ansatz technique for selecting candidate retrievable states with the interpolation procedure for computing the free energy of these states, we prove that, owing to gauge symmetry, the Dyson model accomplishes both serial and parallel processing. We extend this scenario to multiple stored patterns by implementing the Hebb prescription for learning within the couplings.
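Concretely, one convenient way to encode this prescription (an assumed, illustrative form) is to let every pair of neurons interact through the Hebbian kernel damped by hierarchical distance:
\[
J_{ij} = \left( \sum_{\mu=1}^{p} \xi_i^{\mu} \xi_j^{\mu} \right) \sum_{l = d_{ij}}^{k+1} 2^{-2\rho l},
\]
where \xi^1, \dots, \xi^p \in \{-1,+1\}^N are the stored patterns and d_{ij} is the lowest hierarchical level at which neurons i and j share a block, so that \rho again tunes how fast the interaction decays with distance.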
This results in Hopfield-like networks constrained onto a hierarchical topology, for which, restricting to the low-storage regime where the number of patterns grows at most logarithmically with the number of neurons, we prove the existence of the thermodynamic limit for the free energy and give an explicit expression of its mean-field bound and of the related improved bound.
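For concreteness, the low-storage scaling invoked above can be phrased as
\[
p(N) = O(\log N), \qquad \text{i.e.} \quad \limsup_{N \to \infty} \frac{p(N)}{\log N} < \infty,
\]
which, roughly speaking, keeps the quenched noise generated by the non-retrieved patterns negligible in the thermodynamic limit.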
The resulting self-consistency equations for the Mattis magnetizations, which act as order parameters, are studied, and the stability of their solutions is analyzed to obtain a picture of the overall retrieval capabilities of the system in both the mean-field and the non-mean-field scenarios. Our main finding is that embedding the Hebbian rule on a hierarchical topology allows the network to accomplish both serial and parallel processing.
By tuning the level of fast noise affecting the system, or by triggering the decay of the interactions with the distance among neurons, the network may switch from sequential retrieval to multitasking features, and vice versa. However, since these multitasking capabilities are essentially due to the vanishing ‘dialogue’ between spins at long distance, this effective penury of links strongly penalizes the network's capacity, which remains confined to the low-storage regime.