Abstract
We study the effect of various kinds of damage in a neural network whose synaptic bonds have been trained, before the damage, to store a definite number of patterns. We use the Hopfield model of a neural network together with the Hebbian rule of training (learning). By means of very extensive numerical simulations of networks with 600 elements (neurons), we investigate several types of damage. We demonstrate that symmetric and asymmetric damage of the bonds produce no difference in effect. Moreover, the worst damage to the synaptic bonds turns out to be the one that begins by destroying the strongest bonds, whereas in the opposite case, when the damage begins with the weakest bonds, the learned patterns remain preserved even when a large percentage of bonds has been extinguished.
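The mechanism the abstract describes can be illustrated with a minimal sketch (not the authors' code): a Hopfield network with Hebbian couplings, where a chosen fraction of bonds is zeroed out either weakest-first or strongest-first and recall of a stored pattern is then checked. The network size `N = 200`, pattern count `P = 5`, the 50% dilution fraction, and the helper names are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 5  # illustrative size and pattern count (the paper uses 600 neurons)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian rule: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero self-coupling.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def damage(W, fraction, weakest_first=True):
    """Zero out `fraction` of the bonds, ordered by magnitude |W_ij|."""
    Wd = W.copy()
    iu = np.triu_indices_from(Wd, k=1)      # each bond counted once
    order = np.argsort(np.abs(Wd[iu]))      # weakest first
    if not weakest_first:
        order = order[::-1]                 # strongest first
    cut = order[: int(fraction * len(order))]
    Wd[iu[0][cut], iu[1][cut]] = 0.0
    Wd[iu[1][cut], iu[0][cut]] = 0.0        # keep the coupling matrix symmetric
    return Wd

def recall(W, start, steps=10):
    """Iterate synchronous sign updates until a fixed point (or step limit)."""
    s = start.copy()
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1
        if np.array_equal(new, s):
            break
        s = new
    return s

def overlap(s, xi):
    """Overlap m = (1/N) * s . xi; m = 1 means perfect recall."""
    return float(s @ xi) / len(s)

# Remove half of the bonds, weakest-first vs strongest-first,
# and compare recall of the first stored pattern.
W_weak = damage(W, 0.5, weakest_first=True)
W_strong = damage(W, 0.5, weakest_first=False)

m_weak = overlap(recall(W_weak, patterns[0]), patterns[0])
m_strong = overlap(recall(W_strong, patterns[0]), patterns[0])
print(f"overlap after weakest-first damage:   {m_weak:.2f}")
print(f"overlap after strongest-first damage: {m_strong:.2f}")
```

In this toy setting the overlap after weakest-first dilution typically stays close to 1, consistent with the paper's finding that the learned patterns survive such damage far better than when the strongest bonds are destroyed first.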
Original language | English |
---|---|
Pages (from-to) | 526-536 |
Number of pages | 11 |
Journal | Physica A: Statistical Mechanics and its Applications |
Volume | 295 |
Issue number | 3-4 |
Early online date | 22 May 2001 |
DOIs | |
Publication status | Published - 15 Jun 2001 |
Keywords
- Damaged neural networks
- Hopfield model
- Pattern recognition
- Ruined synaptic bonds
ASJC Scopus subject areas
- Mathematical Physics
- Statistical and Nonlinear Physics