Theses and Dissertations at Montana State University (MSU)

Permanent URI for this collection: https://scholarworks.montana.edu/handle/1/733

Search Results

Now showing 1 - 10 of 42
  • RadPC@Scale: an approach to mitigate single event upsets in the memory of space computers
    (Montana State University - Bozeman, College of Engineering, 2022) Williams, Justin Patrick; Chairperson, Graduate Committee: Brock LaMeres
This thesis presents the flight test results of a single event upset (SEU) mitigation strategy for computer data memory. This memory fault mitigation strategy is part of a larger effort to build a radiation tolerant computing system, called RadPC, using commercial-off-the-shelf (COTS) field programmable gate arrays (FPGAs). While previous iterations of RadPC used FPGA block RAM (BRAM) for data memory, the specific component of RadPC presented here is a novel external memory scheme with accompanying systems that can detect and correct faults occurring in the computer's data memory while allowing the computer to continue foreground operation. A prototype implementation of this memory protection scheme was flown on a Raven Aerostar Thunderhead high-altitude balloon system in July of 2021. This flight carried the experiment to an altitude of 75,000 feet for 50 hours, allowing the memory in the experiment to be bombarded with ionizing radiation without being attenuated by the majority of Earth's atmosphere. This thesis discusses the details of the fault mitigation strategy, the design of experiments for the flight demonstration, and the results from the flight data. It may be of interest to engineers who are designing flight computer systems that will be exposed to ionizing radiation and are looking for a lower-cost SEU mitigation strategy compared to existing radiation-hardened solutions.
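    The detect-and-correct-while-running idea can be illustrated with a minimal triple-modular-redundancy sketch in Python. This is an illustration only, not the thesis's actual scheme, which protects external memory in FPGA hardware; the class and method names here are hypothetical:

    ```python
    def majority_vote(a, b, c):
        # Bitwise majority: a bit is 1 if it is set in at least two copies.
        return (a & b) | (a & c) | (b & c)

    class TmrMemory:
        """Triplicated word storage with vote-and-scrub on every read."""

        def __init__(self, size):
            self.copies = [[0] * size for _ in range(3)]

        def write(self, addr, word):
            for copy in self.copies:
                copy[addr] = word

        def read(self, addr):
            a, b, c = (copy[addr] for copy in self.copies)
            word = majority_vote(a, b, c)
            # Scrub: rewrite the voted value so single upsets cannot accumulate.
            self.write(addr, word)
            return word
    ```

    A single bit flip in one copy is outvoted by the other two and repaired on the next read, which is why the computer can keep executing in the foreground.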
  • Identifying RR Lyrae variable stars in the NoirLab Source Catalog with template fitting
    (Montana State University - Bozeman, College of Letters & Science, 2022) Matt, Kyle Louis; Chairperson, Graduate Committee: David L. Nidever
RR Lyrae are periodic variable stars, generally with periods between 5 hours and 1 day. They can be used as standard candles for accurate distance measurements and are thus useful for studying the structure of the Milky Way and its stellar clusters. The second data release of the NoirLab Source Catalog is a large collection of 68 billion time-series measurements of 3.9 billion objects. To process this large volume of data, we designed a Python software package called Leavitt to automate the detection of RR Lyrae and measure their properties, including period, magnitude, epoch of maximum brightness, and pulsation amplitude, by fitting their light curves to templates. In addition to identifying RR Lyrae, it is expected that Leavitt can be extended to identify similar variable stars, such as Cepheids, in the same dataset. Distances were calculated for the initial catalog of RR Lyrae candidates using parameters measured with this software.
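    The distance calculation mentioned above rests on the standard-candle property: once template fitting yields a mean apparent magnitude, distance follows from the distance modulus m - M = 5 log10(d) - 5 (d in parsecs). A minimal sketch, where the absolute magnitude M_V ~ +0.6 is a rough literature-typical assumption rather than Leavitt's actual calibration:

    ```python
    # Assumed RR Lyrae absolute magnitude (illustrative value, not Leavitt's).
    M_V = 0.6

    def distance_parsecs(apparent_mag, absolute_mag=M_V):
        """Invert the distance modulus: m - M = 5*log10(d) - 5."""
        return 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    ```

    For example, a candidate with mean apparent magnitude 10.6 and the assumed M_V would sit at about 1 kpc.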
  • Supporting data-intensive environmental science research: data science skills for scientific practitioners of statistics
    (Montana State University - Bozeman, College of Letters & Science, 2020) Theobold, Allison Shay; Chairperson, Graduate Committee: Stacey Hancock; Stacey Hancock was a co-author of the article, 'How environmental science graduate students acquire statistical computing skills' in the journal 'Statistics education research journal' which is contained within this dissertation.; Stacey Hancock and Sara Mannheimer were co-authors of the article, 'Designing data science workshops for data-intensive environmental science research' submitted to the journal 'Journal of statistics education' which is contained within this dissertation.; Stacey Hancock was a co-author of the article, 'Data science skills in data-intensive environmental science research: the case of Alicia and Ellie' submitted to the journal 'Harvard data science review' which is contained within this dissertation.
The importance of data science skills for modern environmental science research cannot be overstated, yet graduate students in these fields typically lack these integral skills. Over the last 20 years, statistics preparation in these fields has come to be considered vital, and statistics coursework has been readily incorporated into graduate programs. As 'data science' is the study of extracting value from data, the field shares a great deal of conceptual overlap with statistics. Thus, many environmental science degree programs expect students to acquire data science skills in an applied statistics course. A gap exists, however, between the data science skills required for students' participation in the entire data analysis cycle of independent research and those taught in statistics service courses. Over the last ten years, environmental science and statistics educators have outlined the data science skills specific to research in their respective disciplines. Disappointingly, however, both sides of these conversations have ignored the intersection of these fields: the data science skills necessary for environmental science practitioners of statistics. This research describes the nature of environmental science graduate students' need for data science skills when engaging in the data analysis cycle, through the voice of the students. In this work, we present three qualitative studies, each investigating a different aspect of this need. First, we present a study describing environmental science students' experiences acquiring the computing skills necessary to implement statistics in their research. In-depth interviews revealed three themes in these students' paths toward computational knowledge acquisition: use of peer support, seeking out a 'singular consultant,' and learning through independent research. 
Motivated by the need for extracurricular opportunities for acquiring data science skills, next we describe research investigating the design and implementation of a suite of data science workshops for environmental science graduate students. These workshops fill a critical hole in the environmental science and statistics curricula, providing students with the skills necessary to retrieve, view, wrangle, visualize, and analyze their data. Finally, we conclude with research that works toward identifying key data science skills necessary for environmental science graduate students as they engage in the data analysis cycle.
  • Machine learning pipeline for rare-event detection in synthetic-aperture radar and LIDAR data
    (Montana State University - Bozeman, College of Engineering, 2021) Scofield, Trey Palmer; Chairperson, Graduate Committee: Brad Whitaker
In this work, we develop a machine learning pipeline to autonomously classify synthetic aperture radar (SAR) and lidar data in rare-event, remote sensing applications. Here, we predict the presence of volcanoes on the surface of Venus, fish in Yellowstone Lake, and select marine life in the Gulf of Mexico. Given the efficiency of collecting SAR images in space and airborne lidar geographical surveys, the size of the datasets is immense. Large training datasets are desirable for machine learning models; however, the large majority of the data we are using do not contain volcanoes, fish, or marine life. Thus, the machine learning models must be formulated to place a high emphasis on the minority, target classes. The developed pipeline includes data preprocessing, unsupervised clustering, feature extraction, and classification. For each collection of data, sub-images are initially fed through the pipeline to capture fine-detail characteristics and are then mapped back to their original image to identify overall region behavior and the location of the target class(es). For both sub-images and original images, results were quantified and the most effective algorithm combinations and parameters were identified. In this analysis, we determined that the classification results are not sufficient to support a completely autonomous system; rather, some manual review of the data will still need to be performed. Nonetheless, the pipeline serves as an effective tool to reduce costs associated with electronic storage and transmission of the data, as well as the human labor of manually inspecting it, by removing, in some cases, a majority of the unimportant, non-target data while successfully retaining a high percentage of the important images.
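    Placing "a high emphasis on the minority, target classes" is commonly done by weighting each class inversely to its frequency in the training loss. A sketch of the widely used 'balanced' weighting heuristic, offered as an illustration rather than the pipeline's exact formulation:

    ```python
    from collections import Counter

    def balanced_class_weights(labels):
        """Inverse-frequency class weights: w_c = n_samples / (n_classes * n_c).

        Rare target classes (volcanoes, fish) receive proportionally larger
        weights, so misclassifying them costs more during training.
        """
        counts = Counter(labels)
        n, k = len(labels), len(counts)
        return {cls: n / (k * count) for cls, count in counts.items()}
    ```

    With 98 background samples and 2 target samples, the target class is weighted 49 times as heavily as the background class, counteracting the imbalance.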
  • Snow avalanche identification using Sentinel-1: detection rates and controlling factors
    (Montana State University - Bozeman, College of Letters & Science, 2021) Keskinen, Zachary Marshall; Chairperson, Graduate Committee: Jordy Hendrikx; Jordy Hendrikx, Karl Birkeland and Markus Eckerstorfer were co-authors of the article, 'Snow avalanche identification using Sentinel-1 backscatter imagery: detection rates and controlling factors' submitted to the journal 'Natural hazards and Earth system sciences' which is contained within this thesis.
Snow avalanches present a significant hazard that endangers lives and infrastructure. Consistent and accurate datasets of avalanche events are valuable for improving forecasting ability and furthering knowledge of avalanches' spatial and temporal patterns. Remote sensing-based techniques for identifying avalanche debris allow continuous and spatially consistent datasets of avalanches to be acquired. This study utilizes expert manual interpretations of Sentinel-1 synthetic aperture radar (SAR) satellite backscatter images to identify debris from primarily dry slab avalanches, and compares those detections against historical field records of avalanches in the transitional snow climates of Wyoming and Utah. The overall probability of detection (POD) for avalanches large enough to destroy trees or bury a car (i.e., D3 on the Destructive Size Scale) was 64.6%. There was significant variation in the POD among the 13 individual SAR image pairs (15.4-87.0%). Additionally, this study investigated the connection between successful avalanche detections and SAR-specific, topographic, and avalanche type variables. The variables most strongly correlated with higher detection rates were avalanche path length, destructive size of the avalanche, incidence angle of the incoming microwaves, slope angle, and elapsed time between the avalanche and the Sentinel-1 overpass. This study provides an initial exploration of the variables controlling the likelihood of detecting avalanches using Sentinel-1 backscatter change detection techniques, and it supports the generalizability of SAR backscatter difference analysis by applying the methodology in regions with snow climates distinct from previous studies.
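    The probability of detection quoted above is the fraction of field-recorded avalanches recovered by the SAR interpretation. A minimal sketch with hypothetical data structures (the study's actual matching of detections to field records is more involved):

    ```python
    def probability_of_detection(recorded_events, detected_ids):
        """POD = field-recorded avalanches found in the SAR detections,
        divided by all field-recorded avalanches (for one image pair)."""
        hits = sum(1 for event in recorded_events if event in detected_ids)
        return hits / len(recorded_events)
    ```

    Computing this per image pair, as the study does, exposes the pair-to-pair variation that the aggregate 64.6% figure averages over.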
  • Bayesian computing and sampling design for partially-surveyed spatial point process models
    (Montana State University - Bozeman, College of Letters & Science, 2020) Flagg, Kenneth Allen; Chairperson, Graduate Committee: Andrew Hoegh; Andrew Hoegh and John Borkowski were co-authors of the article, 'Modeling partially-surveyed point process data: inferring spatial point intensity of geomagnetic anomalies' in the journal 'Journal of agricultural, biological, and environmental statistics' which is contained within this dissertation.; Andrew Hoegh was a co-author of the article, 'The integrated nested Laplace approximation applied to spatial log-Gaussian Cox process models' submitted to the journal 'Journal of applied statistics' which is contained within this dissertation.; John Borkowski and Andrew Hoegh were co-authors of the article, 'Log-Gaussian Cox processes and sampling paths: towards optimal design' submitted to the journal 'Spatial statistics' which is contained within this dissertation.
    Spatial point processes model situations such as unexploded ordnance, plant and animal populations, and celestial bodies, where events occur at distinct points in space. Point process models describe the number and distribution of these events. These models have been mathematically understood for many decades, but have not been widely used because of computational challenges. Computing advances in the last 30 years have kept interest alive, with several breakthroughs circa 2010 that have made Bayesian spatial point process models practical for many applications. There is now interest in sampling, where the process is only observed in part of the study site. My dissertation work deals with sampling along paths, a standard feature of unexploded ordnance remediation studies. In this dissertation, I introduce a data augmentation procedure to adapt a Dirichlet process mixture model to sampling situations and I provide the first comparison of a variety of sampling designs with regard to their spatial prediction performance for spatial log-Gaussian Cox process (LGCP) models. The Dirichlet process model remains computationally expensive in the sampling case while the LGCP performs well with low computing time. The sampling design study shows that paths with regular spacing perform well, with corners and direction changes being helpful when the path is short.
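    For readers unfamiliar with the models involved: a Cox process is a Poisson process whose intensity function is itself random (log-Gaussian in the LGCP case). Conditional on an intensity, realizations can be drawn by Lewis-Shedler thinning; a one-dimensional sketch, not taken from the dissertation:

    ```python
    import random

    def sample_inhomogeneous_poisson(intensity, t_max, lam_max, rng):
        """Lewis-Shedler thinning: simulate a homogeneous Poisson process at
        rate lam_max, then keep each candidate point with probability
        intensity(t) / lam_max. Requires intensity(t) <= lam_max on [0, t_max].
        """
        points, t = [], 0.0
        while True:
            t += rng.expovariate(lam_max)  # next candidate arrival
            if t > t_max:
                return points
            if rng.random() < intensity(t) / lam_max:
                points.append(t)
    ```

    An LGCP realization would first draw the log-intensity from a Gaussian process and then thin against its exponential; the sampling-design question in the dissertation is where to observe such a process along survey paths.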
  • Scalable solutions to the carbon capture infrastructure problem
    (Montana State University - Bozeman, College of Engineering, 2020) Whitman, Caleb; Chairperson, Graduate Committee: Sean Yaw
CO2 capture and storage (CCS) is a climate change mitigation strategy that aims to reduce the amount of CO2 vented into the atmosphere from industrial processes. Designing cost-effective CCS infrastructure is critical to meeting CO2 emission reduction targets and is a computationally challenging problem. CCS infrastructure design is a generalization of the capacitated fixed charge network flow problem (CFCNF). CFCNF is NP-hard with no known approximation algorithms. In our work, we design three novel heuristics for the CCS infrastructure design problem. We evaluate all heuristics on real-life CCS infrastructure design data and find that they quickly generate solutions close to optimal. Decreasing the time it takes to determine CCS infrastructure designs will support modeling national-level scenarios, undertaking risk and sensitivity assessments, and understanding the impact of government policies (e.g., 45Q tax credits for CCS).
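    To make the fixed-charge flavor of the problem concrete, here is a toy greedy heuristic for a CFCNF-like instance: repeatedly open the edge with the lowest amortized cost per tonne (fixed cost spread over the feasible flow, plus the variable cost). This is an illustrative sketch, not one of the thesis's three heuristics, and the data structures are hypothetical:

    ```python
    def greedy_ccs(sources, sinks, edges):
        """Greedy fixed-charge flow sketch.

        sources: dict of remaining CO2 supply per capture site (mutated).
        sinks:   dict of remaining storage capacity per site (mutated).
        edges:   list of (source, sink, fixed_cost, unit_cost) tuples.
        Returns (total_cost, list of (source, sink, flow) routings).
        """
        total, routed = 0.0, []
        while True:
            best = None
            for s, t, fixed, unit in edges:
                flow = min(sources[s], sinks[t])
                if flow <= 0:
                    continue
                amortized = fixed / flow + unit  # cost per tonne if opened now
                if best is None or amortized < best[0]:
                    best = (amortized, s, t, fixed, unit, flow)
            if best is None:
                return total, routed
            _, s, t, fixed, unit, flow = best
            sources[s] -= flow
            sinks[t] -= flow
            total += fixed + unit * flow
            routed.append((s, t, flow))
    ```

    The fixed charge is what makes the problem hard: an edge's per-tonne cost depends on how much flow it ends up carrying, so locally cheap choices can be globally poor, and heuristics like this trade optimality for speed.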
  • Triplicated instruction set randomization in parallel heterogeneous soft-core processors
    (Montana State University - Bozeman, College of Engineering, 2019) Gahl, Trevor James; Chairperson, Graduate Committee: Brock LaMeres
Today's cyber landscape is as dangerous as ever, owing to an ever-increasing number of cybersecurity threats. Part of this danger comes from code-injection attacks, which are hard to combat due to the software monoculture fostered in today's computing environment. One solution proposed in the past, instruction set randomization, shows promise but incurs large overhead in both timing and physical device space. To address this issue, a new processor architecture was developed that moves instruction set randomization from software implementations into hardware. This architecture consists of three functionally identical soft-core processors operating in parallel while using individually generated random instruction sets. Successful hardware implementation and testing on field programmable gate arrays demonstrates the viability of the new architecture in small-scale systems while also showing potential for expansion to larger systems.
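    The core idea of instruction set randomization can be sketched in a few lines: each core scrambles its program with a private random key at load time and unscrambles at fetch time, so injected code, which was never scrambled with the key, decodes to garbage. This Python model is illustrative only; the thesis implements the encoding in FPGA hardware, and the class here is hypothetical:

    ```python
    import secrets

    class IsrCore:
        """One soft-core's view of instruction set randomization via XOR keys."""

        def __init__(self):
            # Per-core random key; each of the three cores draws its own.
            self.key = secrets.randbits(32)

        def load(self, program):
            """Scramble instruction words at load time."""
            return [word ^ self.key for word in program]

        def fetch(self, scrambled, pc):
            """Descramble one word at fetch time."""
            return scrambled[pc] ^ self.key
    ```

    Triplication adds the detection step: legitimate code descrambles identically on all three cores, while injected code produces three different (almost certainly invalid) instruction streams, so a disagreement flags the attack.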
  • Exploratory study on the effectiveness of type-level complexity metrics
    (Montana State University - Bozeman, College of Engineering, 2018) Smith, Killian; Chairperson, Graduate Committee: Clemente Izurieta
The research presented in this thesis analyzes the feasibility of using information collected at the type level of object-oriented software systems as a metric for software complexity, using the number of recorded faults as the response variable. In other words, we ask: do popular industrial language type systems encode enough of the model logic to provide useful information about software quality? A longitudinal case study was performed on five open source Java projects of varying sizes and domains to obtain empirical evidence supporting the proposed type-level metrics. It is shown that the type-level metrics Unique Morphisms and Logic per Line of Code are more strongly correlated with the number of reported faults than the popular metrics Cyclomatic Complexity and Instability, and performed comparably to Afferent Coupling, Control per Line of Code, and Depth of Inheritance Tree. However, the type-level metrics did not perform as well as Efferent Coupling. In addition to examining metrics at single points in time, successive changes in metrics between software versions were analyzed. There was insufficient evidence to suggest that the metrics reviewed in this case study provide predictive capability with regard to the number of faults in the system. This work is an exploratory study; reducing the threats to external validity requires further research on a wider variety of domains and languages.
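    Correlating a metric with fault counts across projects is typically done with a rank correlation such as Spearman's rho, which is robust to the skewed distributions both quantities tend to have. A self-contained sketch (the thesis does not specify its correlation statistic, so this is an assumption):

    ```python
    def rank(values):
        """Assign ranks 1..n by sorted order (no tie handling)."""
        order = sorted(range(len(values)), key=values.__getitem__)
        ranks = [0.0] * len(values)
        for position, index in enumerate(order):
            ranks[index] = position + 1.0
        return ranks

    def spearman(x, y):
        """Spearman's rho: Pearson correlation computed on the ranks."""
        rx, ry = rank(x), rank(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)
    ```

    A metric like Unique Morphisms being "more strongly correlated" with faults than Cyclomatic Complexity means its rho (per project version, against reported fault counts) is closer to 1 in magnitude.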
  • Development of a smart camera system using a system on module FPGA
    (Montana State University - Bozeman, College of Engineering, 2017) Dack, Connor Aquila; Chairperson, Graduate Committee: Ross K. Snider
Imaging systems can now produce more data than conventional PCs with frame grabbers can process in real time. Moving real-time custom computation as close as possible to the image sensor alleviates the bandwidth bottleneck of moving data multiple times through buffers in conventional PC systems, which are also computation bottlenecks. An example of a high-bandwidth, high-computation application is the use of hyperspectral imagers for sorting. Hyperspectral imagers capture hundreds of colors ranging from the visible spectrum to the infrared. This master's thesis continues the development of the hyperspectral smart camera by integrating the image sensor with a field programmable gate array (FPGA) and by developing an object tracking algorithm for use during the sorting process, with the goal of creating a single compact embedded solution. An FPGA is a hardware-programmable integrated circuit that can be reconfigured depending on the application. The prototype integration involves the development of a custom printed circuit board to connect the data and control lines between the sensor and the FPGA, along with the control code to read data from the sensor. The hyperspectral data is processed on the FPGA and combined with the object edges to make a decision on the quality of the object. The object edges are determined using a line scan camera, which provides data via the Camera Link interface, and a custom object tracking algorithm. The tracking algorithm determines the horizontal edges and center of the object while also tracking its vertical edges and center. The object information is then passed to the air-jet sorting subsystem, which ejects bad objects. The solution integrates the hyperspectral image sensor, the two processing algorithms, and the Camera Link interface into a single, compact unit by implementing the design on the Intel Arria 10 System on Module with custom printed circuit boards.
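    The horizontal edge-and-center step of such a tracking algorithm can be sketched for a single thresholded line-scan row. This is a simplification for illustration; the thesis's algorithm runs in FPGA hardware and also tracks the vertical extent across successive rows:

    ```python
    def object_extent(row, threshold):
        """Locate one object's left/right edges and center in a line-scan row.

        row: pixel intensities from one line-scan readout.
        Returns (left, right, center) pixel indices, or None if no pixel
        exceeds the threshold (no object under the camera).
        """
        above = [i for i, value in enumerate(row) if value > threshold]
        if not above:
            return None
        left, right = above[0], above[-1]
        return left, right, (left + right) // 2
    ```

    Accumulating these per-row extents over time gives the vertical edges and center needed to schedule the air-jet ejection as the object passes.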