Pattern recognition in damaged neural networks

Vladimir Miljković, Sava Milošević (Lead / Corresponding author), Rastko Sknepnek, Ivan Živić

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


We have studied the effects of various kinds of damage that may occur in a neural network whose synaptic bonds have been trained (before the damaging) so as to store a definite number of patterns. We used the Hopfield model of the neural network and applied the Hebbian rule of training (learning). Through very extensive numerical simulations of networks with 600 elements (neurons), we investigated several types of damage. We demonstrate that there is no difference between symmetric and asymmetric damaging of bonds. Furthermore, it turns out that the most harmful damaging of synaptic bonds is the one that starts by ruining the strongest bonds, whereas in the opposite case, that is, when the damaging starts with the weakest bonds, the learnt patterns remain preserved even when a large percentage of bonds is extinguished.
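The setup described in the abstract can be illustrated with a small numerical sketch. The following is not the authors' code; it is a minimal NumPy implementation, under assumed parameters (100 neurons rather than 600, and five random patterns, for speed), of a Hopfield network trained with the Hebbian rule, with symmetric bond damage applied either weakest-first or strongest-first, and recall quality measured by the overlap between the retrieved state and the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # neurons (the paper uses 600; smaller here for speed)
P = 5    # number of stored patterns

# Random +/-1 patterns to be stored.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian rule: J_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, with zero diagonal.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(J, pattern, flip=10, sweeps=10):
    """Run asynchronous dynamics from a corrupted copy of `pattern`;
    return the final overlap m = (1/N) sum_i s_i xi_i, in [-1, 1]."""
    s = pattern.copy()
    idx = rng.choice(N, size=flip, replace=False)
    s[idx] *= -1  # corrupt a few neurons
    for _ in range(sweeps):
        for i in rng.permutation(N):  # asynchronous updates
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s @ pattern / N

def damage(J, fraction, weakest_first):
    """Extinguish (set to zero) a given fraction of bonds, symmetrically,
    starting from the weakest or from the strongest |J_ij|."""
    Jd = J.copy()
    iu = np.triu_indices(N, k=1)
    order = np.argsort(np.abs(Jd[iu]))  # weakest first
    if not weakest_first:
        order = order[::-1]             # strongest first
    kill = order[: int(fraction * len(order))]
    rows, cols = iu[0][kill], iu[1][kill]
    Jd[rows, cols] = Jd[cols, rows] = 0.0
    return Jd

results = {}
for frac in (0.3, 0.6):
    for weakest in (True, False):
        Jd = damage(J, frac, weakest_first=weakest)
        m = np.mean([recall(Jd, p) for p in patterns])
        results[(frac, weakest)] = m
        label = "weakest-first" if weakest else "strongest-first"
        print(f"fraction={frac:.1f}  {label}: mean overlap = {m:.2f}")
```

At this low memory load, runs of this sketch show weakest-first damage leaving the overlap near 1 even for a large fraction of extinguished bonds, consistent with the finding reported in the abstract.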

Original language: English
Pages (from-to): 526-536
Number of pages: 11
Journal: Physica A: Statistical Mechanics and its Applications
Issue number: 3-4
Early online date: 22 May 2001
Publication status: Published - 15 Jun 2001


Keywords

  • Damaged neural networks
  • Hopfield model
  • Pattern recognition
  • Ruined synaptic bonds

ASJC Scopus subject areas

  • Mathematical Physics
  • Statistical and Nonlinear Physics


