Operating Neural Networks on Mobile Devices

Peter Bai, Purdue University

Abstract

Machine learning is a rapidly developing field of computer science research. Deep neural network architectures such as ResNet have allowed computers to process unstructured data such as images and videos with a very high degree of accuracy while delivering those results with reasonably low latency. However, while deep neural networks can achieve very impressive results, they remain memory- and compute-intensive, limiting their use to clusters with substantial resources. This paper examines the possibility of running deep neural networks on mobile hardware, platforms with far more limited memory and computational capacity. We first examine the limitations of a mobile platform and the steps that must be taken to overcome those limitations so that a deep neural network can operate on a mobile device with a reasonable level of performance. We then examine ApproxNet, a neural network designed to run on mobile devices. ApproxNet demonstrates how mobile hardware limits the performance of deep neural networks while also showing that these issues can be overcome to an extent, allowing a neural network to maintain usable levels of latency and accuracy.
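
The latency/accuracy trade-off the abstract alludes to can be illustrated with a minimal sketch. The snippet below is not ApproxNet's actual mechanism; it assumes hypothetical approximation "knobs" (a downsampled input resolution and an early-exit depth), each profiled offline for on-device latency and validation accuracy, and simply selects the most accurate configuration that fits a given latency budget.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ApproxConfig:
    """One candidate approximation setting (hypothetical knobs)."""
    input_size: int      # side length of the downsampled input frame
    exit_depth: int      # which early exit of the network to use
    latency_ms: float    # profiled on-device inference latency
    accuracy: float      # profiled validation accuracy

def pick_config(configs: List[ApproxConfig],
                latency_budget_ms: float) -> Optional[ApproxConfig]:
    """Return the most accurate configuration within the latency budget."""
    feasible = [c for c in configs if c.latency_ms <= latency_budget_ms]
    if not feasible:
        return None
    return max(feasible, key=lambda c: c.accuracy)

if __name__ == "__main__":
    # Illustrative numbers only; real values would come from on-device profiling.
    candidates = [
        ApproxConfig(input_size=112, exit_depth=1, latency_ms=18.0, accuracy=0.61),
        ApproxConfig(input_size=160, exit_depth=2, latency_ms=35.0, accuracy=0.70),
        ApproxConfig(input_size=224, exit_depth=4, latency_ms=80.0, accuracy=0.76),
    ]
    chosen = pick_config(candidates, latency_budget_ms=40.0)
    print(chosen)

Under these assumed numbers, the selector skips the full-resolution configuration and returns the 160-pixel, depth-2 setting, trading some accuracy for a latency that fits a 40 ms budget.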

Degree

M.Sc.

Advisors

Bagchi, Purdue University.

Subject Area

Design|Artificial intelligence|Statistics
