Neuro-Inspired Computing Enhanced by Scalable Algorithms and Physics of Emerging Nanoscale Resistive Devices
Abstract
Deep ‘Analog Artificial Neural Networks’ (AANNs) perform complex classification tasks with high accuracy. However, they consume an enormous amount of power to perform their computations, overshadowing the accuracy benefits. The biological brain, on the other hand, is significantly more powerful than such networks while consuming orders of magnitude less power, indicating a conceptual mismatch. Given that biological neurons are locally connected, communicate using energy-efficient trains of spikes, and behave non-deterministically, incorporating these effects into Artificial Neural Networks (ANNs) may take us a few steps closer to more realistic neural networks.

Emerging devices can offer a plethora of benefits, including power efficiency, faster operation, and low area, across a vast array of applications. For example, memristors and Magnetic Tunnel Junctions (MTJs) are suitable for high-density, non-volatile Random Access Memories when compared with CMOS implementations. In this work, we analyze the possibility of harnessing the characteristics of such emerging devices to achieve neuro-inspired solutions to intricate problems.

We propose how the inherent stochasticity of nanoscale resistive devices can be utilized to realize the functionality of spiking neurons and synapses, which can be incorporated into deep stochastic Spiking Neural Networks (SNNs) for image classification problems. While ANNs mainly dwell in the aforementioned classification domain, they can be adapted for a variety of other applications. One such neuro-inspired solution is the Cellular Neural Network (CeNN) based Boolean satisfiability solver. Boolean satisfiability (k-SAT) is an NP-complete problem (for k ≥ 3) that constitutes one of the hardest classes of constraint satisfaction problems. We provide a proof-of-concept, hardware-based analog k-SAT solver built using MTJs. The inherent physics of MTJs, enhanced by device-level modifications, is harnessed here to emulate the intricate dynamics of an analog, CeNN-based satisfiability (SAT) solver.

Furthermore, in the effort to reach human-level accuracy, increasing the complexity and size of ANNs is crucial. Efficient algorithms for evaluating neural networks are of significant importance for improving the scalability of such networks, in addition to designing hardware accelerators. We propose a scalable approach for evaluating Liquid State Machines (LSMs): a bio-inspired computing model where the inputs are sparsely connected to a randomly interlinked reservoir (or liquid). It has been shown that biological neurons are more likely to be connected to other neurons in close proximity and tend to be disconnected when spatially far apart. Inspired by this, we propose a group of locally connected neuron reservoirs, i.e., an ensemble-of-liquids approach, for LSMs. We analyze how segmenting a single large liquid into an ensemble of multiple smaller liquids affects the latency and accuracy of an LSM. Our results illustrate that the ensemble approach enhances class discrimination (quantified as the ratio between the Separation Property (SP) and the Approximation Property (AP)), leading to improved accuracy in speech and image recognition tasks compared with a single large liquid. Furthermore, we obtain performance benefits in terms of improved inference time and reduced memory requirements, due to the lower number of connections and the freedom to parallelize the liquid evaluation process.
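To make the stochastic-neuron idea concrete, the following minimal Python sketch models a neuron whose spiking event is the probabilistic switching of a resistive device: the membrane potential leakily integrates the synaptic current, and the device switches with a probability that grows with that potential. The function names (mtj_switch_prob, stochastic_neuron_step), the sigmoidal switching law, and all constants are illustrative assumptions, not the device model used in the dissertation.

import numpy as np

rng = np.random.default_rng(0)

def mtj_switch_prob(v_mem, v_c=1.0, beta=8.0):
    # Hypothetical sigmoidal switching probability of a resistive device
    # as a function of the accumulated drive (v_c = nominal threshold).
    return 1.0 / (1.0 + np.exp(-beta * (v_mem / v_c - 1.0)))

def stochastic_neuron_step(v_mem, i_syn, leak=0.9):
    # One time step of a stochastic spiking neuron: the membrane potential
    # leakily integrates the synaptic current, and the device switches
    # (i.e. the neuron spikes) with a probability set by that potential.
    v_mem = leak * v_mem + i_syn
    spike = rng.random() < mtj_switch_prob(v_mem)
    if spike:
        v_mem = 0.0  # a switching event resets the neuron
    return v_mem, spike

# Example: drive one neuron with a constant input and collect its spike train
v, spikes = 0.0, []
for _ in range(100):
    v, s = stochastic_neuron_step(v, i_syn=0.25)
    spikes.append(int(s))
print(sum(spikes), "spikes in 100 steps")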
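The CeNN-based SAT solver emulates coupled analog cell dynamics that relax toward a satisfying assignment. As a software stand-in for that behavior, the sketch below integrates continuous-time SAT dynamics in the spirit of Ercsey-Ravasz and Toroczkai; the specific equations, step size, and stopping rule are assumptions made for illustration and are not taken from the dissertation's MTJ hardware model.

import numpy as np

def analog_sat_solve(clauses, n_vars, dt=0.05, steps=20000, seed=1):
    # `clauses` is a list of tuples of signed literals, e.g. (1, 2, -3).
    rng = np.random.default_rng(seed)
    # C[m, i] = +1 / -1 if variable i appears plain / negated in clause m
    C = np.zeros((len(clauses), n_vars))
    for m, clause in enumerate(clauses):
        for lit in clause:
            C[m, abs(lit) - 1] = np.sign(lit)
    k = np.count_nonzero(C, axis=1)              # clause lengths
    s = rng.uniform(-1.0, 1.0, n_vars)           # analog variables in [-1, 1]
    a = np.ones(len(clauses))                    # auxiliary clause weights
    assignment = s > 0
    for _ in range(steps):
        K = np.prod(1.0 - C * s, axis=1) / 2.0 ** k   # clause "unsatisfaction"
        grad = np.zeros(n_vars)
        for m in range(len(clauses)):
            for i in np.nonzero(C[m])[0]:
                K_mi = K[m] / max(1.0 - C[m, i] * s[i], 1e-9)
                grad[i] += 2.0 * a[m] * C[m, i] * K_mi * K[m]
        s = np.clip(s + dt * grad, -1.0, 1.0)
        a = a * np.exp(dt * K)                   # weights grow while a clause stays unsatisfied
        assignment = s > 0
        if all(any((lit > 0) == assignment[abs(lit) - 1] for lit in cl) for cl in clauses):
            break
    return assignment

# Example: (x1 or x2 or not x3) and (not x1 or x3) and (not x2 or x3)
print(analog_sat_solve([(1, 2, -3), (-1, 3), (-2, 3)], n_vars=3))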
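The ensemble-of-liquids idea replaces one large, randomly connected reservoir with several smaller, mutually disconnected ones that share the input and whose states are concatenated for the readout; because the liquids do not interact, they can be evaluated in parallel and need far fewer recurrent connections. The sketch below uses a rate-based, echo-state-style reservoir as a simplified stand-in for the spiking liquid; all sizes, sparsities, and the leaky-tanh update are illustrative assumptions rather than the parameters used in this work.

import numpy as np

rng = np.random.default_rng(42)

def make_liquid(n_neurons, p_conn=0.2, spectral_radius=0.9):
    # One small reservoir: sparse random recurrent weights, rescaled so the
    # dynamics stay near the edge of stability (standard echo-state heuristic).
    W = rng.normal(0.0, 1.0, (n_neurons, n_neurons))
    W *= rng.random((n_neurons, n_neurons)) < p_conn
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W

def run_ensemble(liquids, input_weights, inputs, leak=0.3):
    # Evaluate an ensemble of independent liquids on the same input stream.
    # The liquids have no cross connections, so this loop could run in
    # parallel; their final states are concatenated and fed to the readout.
    states = []
    for W, W_in in zip(liquids, input_weights):
        x = np.zeros(W.shape[0])
        for u in inputs:                                  # inputs: (T, n_in)
            x = (1.0 - leak) * x + leak * np.tanh(W @ x + W_in @ u)
        states.append(x)
    return np.concatenate(states)

# Example: four 125-neuron liquids standing in for one 500-neuron liquid
n_in, n_per_liquid, n_liquids = 8, 125, 4
liquids = [make_liquid(n_per_liquid) for _ in range(n_liquids)]
input_weights = [rng.normal(0.0, 0.5, (n_per_liquid, n_in))
                 * (rng.random((n_per_liquid, n_in)) < 0.1) for _ in range(n_liquids)]
state = run_ensemble(liquids, input_weights, rng.normal(size=(50, n_in)))
print(state.shape)   # (500,) concatenated liquid state for the readout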
Degree
Ph.D.
Advisors
Roy, Purdue University.
Subject Area
Artificial intelligence|Energy|Marketing