Parallel, self-organizing hierarchical neural networks

Daesik Hong, Purdue University

Abstract

This thesis presents a new neural network architecture called the parallel, self-organizing hierarchical neural network (PSHNN). The architecture consists of a number of stages, each of which can be a particular neural network, referred to as a stage neural network (SNN). At the end of each stage, error detection is carried out and a number of input vectors are rejected. Between two stages, the input vectors rejected by the previous stage undergo a nonlinear transformation. The new architecture has many desirable properties: system complexity is optimized in the sense that the number of stages is minimized in a self-organizing manner, classification accuracy is high, learning and recall times are minimized, and the architecture is truly parallel, with all stages operating simultaneously without waiting for data from one another during testing. Experiments comparing the new architecture to multilayered networks with backpropagation training indicated its superiority in terms of classification accuracy, training time, parallelism, and robustness.
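To make the stage-by-stage scheme concrete, here is a minimal Python sketch of the training loop the abstract describes: train a stage, reject the vectors it still misclassifies, nonlinearly transform the rejected vectors, and grow a new stage for them. The `Stage` class, the delta-rule update, and the `tanh` transform are illustrative assumptions for this sketch, not the specific choices made in the thesis.

```python
import numpy as np

class Stage:
    """One stage neural network (SNN); here a simple delta-rule classifier
    with hard-limiting output units (an assumed stand-in for the SNN)."""
    def __init__(self, n_in, n_out, lr=0.1, epochs=50):
        self.W = np.zeros((n_out, n_in))
        self.lr, self.epochs = lr, epochs

    def fit(self, X, Y):
        for _ in range(self.epochs):
            for x, y in zip(X, Y):
                self.W += self.lr * np.outer(y - self.predict_one(x), x)

    def predict_one(self, x):
        return (self.W @ x > 0).astype(float)

def train_pshnn(X, Y, max_stages=3, transform=np.tanh):
    """Grow stages until no vectors are rejected or max_stages is reached."""
    stages = []
    for _ in range(max_stages):
        snn = Stage(X.shape[1], Y.shape[1])
        snn.fit(X, Y)
        # Error detection: reject the vectors this stage still misclassifies.
        rejected = np.array([not np.array_equal(snn.predict_one(x), y)
                             for x, y in zip(X, Y)])
        stages.append(snn)
        if not rejected.any():
            break
        # Nonlinear transformation of the rejected vectors before the next stage.
        X, Y = transform(X[rejected]), Y[rejected]
    return stages
```

During testing, all stages can evaluate an input simultaneously, with each output accepted or rejected by the corresponding error-detection step, which is what makes the architecture truly parallel at recall time.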

Degree

Ph.D.

Advisors

Ersoy, Purdue University.

Subject Area

Electrical engineering; Remote sensing
