A correlation significance learning scheme for auto-associative memories.

D. L. Lee, W. J. Wang

Research output: Contribution to journal › Article › peer-review


A new concept called correlation significance is introduced for expanding the attraction regions around all the stored vectors (attractors) of an asynchronous auto-associative memory. Since the well-known outer product rule adopts an equally weighted correlation matrix for the neuron connections, the attraction region around each attractor is not maximized. To maximize these attraction regions, we devise a rule in which the correlations between different components of different stored patterns are unequally weighted. Under this formalism, the connection matrix T of the asynchronous neural network is designed by a gradient descent approach. Additionally, an exponential-type error function is constructed so that the number of successfully stored vectors can be monitored directly throughout the learning process. Finally, computer simulations demonstrate the efficiency and capability of this scheme.
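The abstract's idea can be illustrated with a minimal sketch: start from the classical equally weighted outer-product (Hebbian) matrix, then adjust the connection matrix T by gradient descent on an exponential-type error, so that components whose local field disagrees with the stored pattern are penalized most heavily. This is an assumed reconstruction for illustration, not the paper's exact formulation; the pattern sizes, learning rate, and loss form are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 32, 8  # neurons and number of stored bipolar patterns (illustrative)
X = rng.choice([-1.0, 1.0], size=(P, N))  # stored vectors x^mu in {-1, +1}^N

# Classical equally weighted outer-product rule as the starting point.
T = X.T @ X / N
np.fill_diagonal(T, 0.0)

def num_stored(T, X):
    """Count patterns that are fixed points of the sign update rule."""
    H = X @ T.T  # local fields h_i^mu = sum_j T_ij x_j^mu
    return int(np.sum(np.all(np.sign(H) == X, axis=1)))

# Gradient descent on T with an exponential error: terms exp(-x_i^mu h_i^mu)
# blow up when a component is misaligned, so those correlations end up
# weighted more strongly than already-satisfied ones.
lr = 0.05
for step in range(200):
    H = X @ T.T
    align = X * H                 # x_i^mu * h_i^mu per component
    E = np.exp(-align)            # exponential error terms
    # dE/dT_ij = -sum_mu exp(-x_i^mu h_i^mu) x_i^mu x_j^mu
    grad = -(E * X).T @ X
    T -= lr * grad / P
    np.fill_diagonal(T, 0.0)

print(num_stored(T, X))
```

Because every stored pattern contributes a strictly decreasing penalty until all of its components align with their local fields, the count of successfully stored vectors can be read off during training, mirroring the role the abstract assigns to the exponential error function.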

Original language: English
Pages (from-to): 455-462
Number of pages: 8
Journal: International Journal of Neural Systems
Issue number: 4
State: Published - Dec 1995


