Abstract

In this report, two applications of neural networks are investigated. The first is low bit rate image compression using neural networks and projection pursuit. The second is improving the classification accuracy of neural network classifiers by using unlabeled data.

In the first part, a novel approach for low bit rate image coding is presented. The image is compressed by first quadtree segmenting it into blocks of different sizes based on two activity measures, and then constructing a distinct code for each block by invoking the theory of projection pursuit. The two activity measures used in this work are the block variance and the peak signal-to-noise ratio (PSNR) of the reconstructed block. It is shown that the projection pursuit coding algorithm can adaptively construct a better approximation for each block until the desired signal-to-noise ratio or bit rate is achieved. The method also adaptively finds the optimum network configuration. In terms of the objective performance measure (PSNR), the decoded images are superior to JPEG-decoded images, and the subjective quality of images encoded with the proposed algorithm is likewise superior to that of JPEG-encoded images.

In the second part, improving the classification accuracy of neural network classifiers using unlabeled testing data is presented. In order to fully utilize the information contained in high-dimensional data, training samples are needed from all classes. To increase classification accuracy without increasing the number of training samples, the network makes use of testing data along with training data for learning. However, the testing data are unlabeled, whereas the training data are labeled. It was shown previously, for the case of parametric classifiers, that decision rules which use both labeled (training) and unlabeled (testing) samples have a lower expected error than those which use labeled samples only. Since the output of a neural network such as a backpropagation network approximates the a posteriori probabilities, the same result applies to neural network classifiers. It is shown that including unlabeled samples from under-represented classes in the training set improves the classification accuracy of some of the classes during supervised-unsupervised learning.
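As a rough illustration of the quadtree segmentation step described in the first part, the sketch below recursively splits a square block whenever its variance (one of the two activity measures) exceeds a threshold, down to a minimum block size. The threshold values, minimum size, and function names are illustrative assumptions, not parameters taken from the report.

```python
import numpy as np

def quadtree_segment(img, y, x, size, var_thresh=100.0, min_size=4):
    """Recursively split a square block while its variance (one of the
    two activity measures) exceeds var_thresh. Returns a list of
    (y, x, size) leaf blocks. Threshold and min_size are illustrative."""
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.var() <= var_thresh:
        return [(y, x, size)]
    half = size // 2
    blocks = []
    for dy in (0, half):
        for dx in (0, half):
            blocks += quadtree_segment(img, y + dy, x + dx, half,
                                       var_thresh, min_size)
    return blocks

# Example: segment a synthetic 64x64 image into variable-size blocks.
img = np.random.rand(64, 64) * 255
leaves = quadtree_segment(img, 0, 0, 64)
```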
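The incremental, projection-pursuit-style construction, in which terms are added to a block's approximation until a target PSNR or bit budget is met, might look schematically like the following. The greedy rank-one (outer-product) residual fit used here is only a stand-in for the report's actual ridge-function terms, and all names and thresholds are assumptions.

```python
import numpy as np

def psnr(orig, recon, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((orig - recon) ** 2)
    return np.inf if mse == 0 else 10 * np.log10(peak ** 2 / mse)

def greedy_block_code(block, target_psnr=35.0, max_terms=8):
    """Add one term at a time to the block approximation until the
    target PSNR is reached, mirroring the adaptive 'grow until good
    enough' idea. Each term is the leading rank-one component of the
    residual (via SVD); this substitutes for the report's projection
    pursuit ridge functions."""
    approx = np.zeros_like(block, dtype=float)
    terms = []
    for _ in range(max_terms):
        if psnr(block, approx) >= target_psnr:
            break
        u, s, vt = np.linalg.svd(block - approx)
        term = s[0] * np.outer(u[:, 0], vt[0])  # best rank-one fit
        approx += term
        terms.append((s[0], u[:, 0], vt[0]))
    return terms, approx

block = np.random.rand(8, 8) * 255
terms, approx = greedy_block_code(block)
```

Because the number of terms grows with block activity, busy blocks receive more terms and smooth blocks fewer, which is the sense in which the approximation, and hence the bit rate, adapts per block.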
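For the second part, a minimal sketch of supervised-unsupervised learning in a decision-directed style is given below: the softmax outputs, read as estimates of the a posteriori probabilities, assign labels to the unlabeled samples, which are then mixed into the gradient update. The single-layer network, the weighting factor alpha, and the update rule are all assumptions for illustration, not the report's procedure.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train(X_lab, y_lab, X_unl, n_classes, epochs=200, lr=0.1, alpha=0.5):
    """Softmax classifier trained on labeled samples plus
    network-assigned labels for unlabeled samples. The outputs
    approximate a posteriori probabilities, so the unlabeled term is a
    decision-directed update; alpha is an assumed weighting."""
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.01, size=(X_lab.shape[1], n_classes))
    Y = np.eye(n_classes)[y_lab]                    # one-hot labels
    for _ in range(epochs):
        P_lab = softmax(X_lab @ W)
        grad = X_lab.T @ (P_lab - Y) / len(X_lab)   # labeled gradient
        P_unl = softmax(X_unl @ W)
        soft = np.eye(n_classes)[P_unl.argmax(1)]   # current decisions
        grad += alpha * X_unl.T @ (P_unl - soft) / len(X_unl)
        W -= lr * grad
    return W

# Toy usage: 2-D Gaussian classes, a few labeled and many unlabeled points.
rng = np.random.default_rng(1)
X_lab = np.vstack([rng.normal(-1, 1, (10, 2)), rng.normal(1, 1, (10, 2))])
y_lab = np.array([0] * 10 + [1] * 10)
X_unl = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
W = train(X_lab, y_lab, X_unl, n_classes=2)
```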

Date of this Version

September 1996
