Solving constrained optimization problems with neural networks

Walter Engevald Lillo, Purdue University

Abstract

The subject of this thesis is the application of artificial neural networks to solving linear and nonlinear programming problems. A new class of neural networks for solving constrained optimization problems is proposed. The proposed neural networks can be implemented using common circuit elements. We analyze the dynamics of the proposed networks using the second method of Lyapunov and the penalty function approach. Implementation issues are discussed, and a successful circuit implementation of the proposed neural network for a quadratic programming problem is presented. In addition, we discuss applications of the proposed neural architectures to solving a class of minimum norm problems. In particular, the specific cases of minimizing the l₁, l₂, and l∞ norms of a vector subject to a set of equality constraints are examined, and implementations corresponding to these problems are given. Finally, the dynamics of the generalized BSB (Brain-State-in-a-Box) model are examined for the case where the weight matrix is symmetric. It is shown that this model minimizes an energy function over an n-dimensional hypercube.
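The penalty-function approach mentioned in the abstract can be illustrated with a minimal sketch: the network's state evolves by gradient descent on an energy that adds a quadratic penalty for constraint violation to the cost function. The specific problem, gain K, and step size below are hypothetical choices for illustration, not taken from the thesis; the example minimizes 0.5(x₁² + x₂²) subject to x₁ + x₂ = 2, whose exact solution is x₁ = x₂ = 1.

```python
# Gradient-flow ("neural network") solver for an equality-constrained QP,
# using a quadratic penalty term.  Hypothetical example problem:
#   minimize 0.5*(x1^2 + x2^2)  subject to  x1 + x2 = 2
# Energy:  E(x) = 0.5*(x1^2 + x2^2) + (K/2)*(x1 + x2 - 2)^2
# The continuous dynamics dx/dt = -grad E(x) are integrated by Euler steps.
def qp_penalty_flow(k=100.0, dt=0.005, steps=5000):
    x1, x2 = 0.0, 0.0
    for _ in range(steps):
        r = x1 + x2 - 2.0       # constraint residual
        g1 = x1 + k * r         # dE/dx1
        g2 = x2 + k * r         # dE/dx2
        x1 -= dt * g1           # Euler step of dx/dt = -grad E
        x2 -= dt * g2
    return x1, x2
```

With a finite penalty gain K the equilibrium only approximates the constrained minimizer (here x₁ = x₂ ≈ 0.995 for K = 100); increasing K tightens the approximation but stiffens the dynamics, which is one reason the circuit realization of such networks requires care.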
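The BSB result mentioned at the end of the abstract can likewise be sketched in a few lines: the state is repeatedly amplified through a symmetric weight matrix and clipped componentwise to the hypercube [-1, 1]ⁿ, which drives the energy E(x) = -0.5·xᵀWx downhill so trajectories settle at energy-minimizing vertices. The weight matrix, feedback gain, and starting point below are hypothetical illustrations, not values from the thesis.

```python
# One BSB (Brain-State-in-a-Box) iteration: x <- clip(x + a*W*x), clipped to [-1, 1]^n.
# For symmetric W this descends the energy E(x) = -0.5 * x^T W x over the hypercube.
def bsb_step(x, w, a=0.5):
    n = len(x)
    wx = [sum(w[i][j] * x[j] for j in range(n)) for i in range(n)]
    return [max(-1.0, min(1.0, x[i] + a * wx[i])) for i in range(n)]

def run_bsb(x, w, a=0.5, steps=50):
    for _ in range(steps):
        x = bsb_step(x, w, a)
    return x

# Hypothetical symmetric weight matrix: W couples the two components positively,
# so E is minimized at the vertices (1, 1) and (-1, -1) (E = -1 there, E = +1 at
# the mixed-sign vertices).  A start in the positive quadrant is drawn to (1, 1).
W = [[0.0, 1.0], [1.0, 0.0]]
final = run_bsb([0.1, 0.2], W)
```

This mirrors the abstract's claim at toy scale: the symmetric-weight BSB dynamics act as an energy minimizer over the n-dimensional hypercube, with the stable vertices playing the role of stored patterns.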

Degree

Ph.D.

Advisors

Zak, Purdue University.

Subject Area

Electrical engineering|Artificial intelligence

