Show simple item record

dc.contributor.advisor: Chairperson, Graduate Committee: John Sheppard
dc.contributor.author: Senecal, Jacob John
dc.date.accessioned: 2021-06-09T18:47:45Z
dc.date.available: 2021-06-09T18:47:45Z
dc.date.issued: 2019
dc.identifier.uri: https://scholarworks.montana.edu/xmlui/handle/1/16201
dc.description.abstract: While a great deal of research has been directed towards developing neural network architectures for classifying RGB images, there is a relative dearth of research directed towards developing architectures specifically for multi-spectral and hyper-spectral imagery. The additional spectral information contained in a multi-spectral or hyper-spectral image can be valuable for land management, agriculture and forestry, disaster control, humanitarian relief operations, and environmental monitoring. However, the massive amounts of data generated by a multi-spectral or hyper-spectral instrument make processing this data a challenge. Machine learning and computer vision techniques could automate the analysis of these rich data sources. With these benefits in mind, we have adapted recent developments in small, efficient convolutional neural networks (CNNs) to create a small CNN architecture capable of being trained from scratch to classify 10-band multi-spectral images, using far fewer parameters than popular deep architectures such as ResNet or DenseNet. We show that this network provides higher classification accuracy and greater sample efficiency than the same network using RGB images. We also show that it is possible to employ a transfer learning approach and use a network pre-trained on multi-spectral satellite imagery to increase accuracy on a second, much smaller multi-spectral dataset, even though the satellite imagery was captured from a very different perspective (high-altitude, overhead vs. ground-based at close stand-off distance). These results demonstrate that it is possible to train our small network architectures on small multi-spectral datasets and still achieve high classification accuracy. This is significant because labeled hyper-spectral and multi-spectral datasets are generally much smaller than their RGB counterparts. Finally, we approximate a Bayesian version of our CNN architecture using a recent technique known as Monte Carlo dropout. By keeping dropout in place at test time, we can perform a Monte Carlo procedure using multiple forward passes of the network to generate a distribution of network outputs, which can be used as a measure of uncertainty in the network's predictions. Large variance in the network output corresponds to high uncertainty, and vice versa. We show that a network capable of working with multi-spectral imagery significantly reduces the uncertainty associated with class predictions compared to using RGB images. This analysis reveals that the benefits of an architecture that works effectively with multi-spectral or hyper-spectral imagery extend beyond higher classification accuracy: multi-spectral and hyper-spectral imagery also allows us to be more confident in the predictions a deep neural network is making.
dc.language.iso: en
dc.publisher: Montana State University - Bozeman, Norm Asbjornson College of Engineering
dc.subject.lcsh: Optical spectroscopy
dc.subject.lcsh: Photography
dc.subject.lcsh: Classification
dc.subject.lcsh: Machine learning
dc.subject.lcsh: Neural networks (Computer science)
dc.subject.lcsh: Uncertainty
dc.title: Convolutional neural networks for multi- and hyper-spectral image classification
dc.type: Thesis
dc.rights.holder: Copyright 2019 by Jacob John Senecal
thesis.degree.committeemembers: Members, Graduate Committee: Joseph A. Shaw; David Millman
thesis.degree.department: Gianforte School of Computing.
thesis.degree.genre: Thesis
thesis.degree.name: MS
thesis.format.extentfirstpage: 1
thesis.format.extentlastpage: 106
mus.data.thumbpage: 21
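
The abstract above describes estimating prediction uncertainty with Monte Carlo dropout: dropout is left active at test time and several stochastic forward passes produce a distribution of outputs whose variance signals uncertainty. The sketch below illustrates that procedure only; it is not the thesis's code. The `SmallMultiSpectralCNN` class, the `mc_dropout_predict` helper, the layer sizes, and the number of passes are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallMultiSpectralCNN(nn.Module):
    """Toy CNN accepting 10-band multi-spectral input (placeholder architecture, not the thesis's)."""
    def __init__(self, num_bands: int = 10, num_classes: int = 10, p_drop: float = 0.5):
        super().__init__()
        self.conv1 = nn.Conv2d(num_bands, 32, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.dropout = nn.Dropout(p=p_drop)   # kept active at test time for MC dropout
        self.fc = nn.Linear(64, num_classes)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = F.relu(self.conv2(x))
        x = self.pool(x).flatten(1)
        x = self.dropout(x)
        return self.fc(x)

@torch.no_grad()
def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_passes: int = 50):
    """Run several stochastic forward passes with dropout enabled; return the mean
    class probabilities and their variance across passes (the uncertainty signal)."""
    model.train()  # keeps dropout sampling; in real use, handle batch-norm layers separately
    probs = torch.stack([F.softmax(model(x), dim=1) for _ in range(n_passes)])
    return probs.mean(dim=0), probs.var(dim=0)

# Usage: high variance across passes indicates low confidence in the prediction.
model = SmallMultiSpectralCNN()
images = torch.randn(4, 10, 64, 64)          # a batch of 10-band image patches
mean_probs, var_probs = mc_dropout_predict(model, images)
predictions = mean_probs.argmax(dim=1)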