Large scale neural associative memory design and its applications
The human brain has a remarkable capability to recall information when a sufficient clue is presented. This is known as content-based recall, or associative memory, where information is accessed by content rather than by address. In this thesis, a memory is a device into which information can be inserted and stored, and from which it can be extracted when needed. It is called a neural associative memory when it is constructed using a dynamical system called an artificial neural network. A difficulty with neural associative memory design is the quadratic growth of the number of interconnections with the size of the pattern to be stored. On the other hand, the recall performance of associative memories deteriorates as the size of the neural network is reduced.

To overcome these problems, associative memories based on the generalized Brain-State-in-a-Box (gBSB) neural network are developed that can process large-scale patterns efficiently and can be applied to systems that process images. In particular, interconnected, gBSB-based neural associative memory architectures are proposed, and a method to determine the interconnection parameters using an evolution strategy is devised. Next, a large-scale associative memory is developed using a pattern decomposition concept. As an application, an image storage and retrieval system is constructed using a large-scale associative memory based on gBSB neural networks. Finally, a neural associative memory that stores and retrieves pattern sequences is developed, and a large-scale associative memory is constructed employing this neural network.
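To make the recall mechanism concrete, the following is a minimal sketch of Brain-State-in-a-Box-style dynamics: the state is repeatedly updated by the weight matrix and clipped to the hypercube [-1, 1]^n, so a corrupted bipolar pattern is driven toward a stored vertex. The outer-product (Hebbian-type) weight design, the step size `alpha`, and the iteration count used here are illustrative assumptions, not the specific gBSB design or parameter-selection method developed in the thesis.

```python
import numpy as np

def recall(W, x0, alpha=0.3, iters=60):
    """Iterate x <- clip(x + alpha * W x, -1, 1): BSB-style recall dynamics."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        x = np.clip(x + alpha * (W @ x), -1.0, 1.0)
    return x

# Two orthogonal bipolar patterns of length n = 8 (toy example, assumed).
v1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
v2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])

# Illustrative outer-product weight design: W = (1/n) * sum_i v_i v_i^T.
n = len(v1)
W = (np.outer(v1, v1) + np.outer(v2, v2)) / n

# Corrupt one component of v1 and recall the stored pattern.
probe = v1.copy()
probe[0] = -probe[0]
result = recall(W, probe)
print(np.sign(result).astype(int))  # recovers v1
```

A network that must store images quickly runs into the quadratic interconnection growth the abstract describes: an n-pixel pattern needs an n-by-n weight matrix. The pattern decomposition idea is to split a large pattern into subpatterns, each handled by a smaller network, which is what makes the large-scale designs in the thesis tractable.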
Zak, Purdue University.