Avian retinal photoreceptor detection and classification using Convolutional Neural Networks
Avian retinas contain cone photoreceptors that filter out specific wavelengths of light, giving birds fine discrimination between colors. There are five distinct types of photoreceptors: red, yellow, transparent, colorless, and principal. A specific photoreceptor type can be identified by an organelle called an oil droplet. Detecting and classifying the oil droplets is currently done by hand, which can be a time-consuming process. Using computer vision, detection and classification of the photoreceptors can be automated. The recent introduction of deep learning in computer vision has revolutionized automatic classification, producing results that match human performance. Using deep learning, the human element can be eliminated from oil droplet detection and classification: it can take days for a human to process an entire retina, but a computer can perform the same task in a matter of minutes. In this work, using current state-of-the-art deep learning frameworks, we have created a Convolutional Neural Network (CNN) that classifies the photoreceptors in microscope images identically to human-classified images. By coupling the CNN with various object detection algorithms, including R-CNN and the two-stage Hough transform, potential oil droplets can be both detected and classified automatically. Our results show that a CNN trained to classify the five photoreceptor types achieves human-level accuracy. Detection algorithms still lag behind classification in accuracy, with the best algorithms obtaining only 66\% on the PASCAL VOC 2012 data-set. Despite these limitations in detection, we show that by adjusting the color contrast of the retina images, accurate detection of oil droplet regions can be achieved.
Benes, Purdue University.
Ophthalmology|Developmental biology|Computer science