Parallel probabilistic self-organizing hierarchical neural networks

Faramarz Valafar, Purdue University

Abstract

A new neural network architecture called the Parallel Probabilistic Self-Organizing Hierarchical Neural Network (PPSHNN) is introduced. The PPSHNN is designed to solve complex classification problems by dividing the input vector space into regions and performing classification on each region. It consists of several modules which operate hierarchically during learning and in parallel during testing. Each module has the task of classifying a region of the input information space, as well as participating in the formation of these regions through post- and pre-rejection schemes. The decomposition into regions is performed in a manner that makes classification easier on each of the regions. The post-rejector submodule performs a bitwise statistical analysis to detect hard-to-classify vectors. The pre-rejector module accepts only those classes for which the module is trained and rejects all others. The PNS module is developed as a variation of the PPSHNN module. If delta rule networks are used to build the submodules of the PNS, it uses piecewise linear boundaries to divide the problem space into regions. The PNS module achieves high classification accuracy while remaining relatively inexpensive. The submodules of the PNS are fractile in nature, meaning that each such unit may itself consist of a number of PNS modules. The PNS module is discussed as the building block for the synthesis of the PPSHNN. The SIMD version of the PPSHNN is implemented on MasPar with 16K processors. In all experiments performed, this network outperformed previously used networks in terms of classification accuracy and speed.
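To make the abstract's two key ingredients concrete, the following is a minimal illustrative sketch (not the dissertation's actual code): a single delta-rule (Widrow-Hoff) linear unit of the kind that yields piecewise-linear region boundaries, combined with a simple threshold-based rejection test in the spirit of the pre-/post-rejector idea. All function names, the margin threshold, and the toy data are assumptions introduced here for illustration.

```python
import numpy as np

def train_delta_rule(X, y, lr=0.02, epochs=200):
    """Train a linear unit with the delta rule so X @ w approximates
    targets y in {-1, +1} (least-mean-squares updates)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            err = y_i - x_i @ w   # delta-rule error term
            w += lr * err * x_i   # gradient step on squared error
    return w

def classify_or_reject(w, x, margin=0.5):
    """Accept a confident linear decision; reject vectors that fall near
    the boundary so another module can handle them (threshold is
    illustrative, not from the dissertation)."""
    s = x @ w
    if abs(s) < margin:
        return None               # rejected: hard to classify here
    return 1 if s > 0 else -1

# Toy linearly separable data; the last column is a bias input of 1.
X = np.array([[1.0, 2.0, 1.0], [2.0, 3.0, 1.0],
              [-1.0, -2.0, 1.0], [-2.0, -1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = train_delta_rule(X, y)
print(classify_or_reject(w, np.array([1.5, 2.5, 1.0])))  # → 1
```

In the full architecture, rejected vectors would be routed to further modules trained on the harder regions, which is how the hierarchy carves the input space into pieces that are each easier to classify.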

Degree

Ph.D.

Advisors

Ersoy, Purdue University.

Subject Area

Electrical engineering|Biomedical research|Neurology|Artificial intelligence
