Scholarship & Research

Permanent URI for this community: https://scholarworks.montana.edu/handle/1/1

Search Results

Now showing 1-4 of 4
  • Modeling mass balance at Robertson Glacier, Alberta, Canada 1912-2012
    (Montana State University - Bozeman, College of Letters & Science, 2017) Scanlon, Ryan Scott; Chairperson, Graduate Committee: Mark L. Skidmore; Jordy Hendrikx (co-chair)
    Glacier mass balance is important to study due to the role of glaciers in the hydrological cycle, yet it is typically difficult to measure without numerous in situ measurements and monitoring over the course of many years. Physically based melt models, which estimate melt from temperature, solar radiation, and albedo, are used extensively in this thesis. A Degree Day (DD) model and an Enhanced Temperature Index (ETI) model are used to model mass balance for Robertson Glacier, Alberta, Canada during the period 1912-2012. The DD model incorporates only temperature, while the ETI model incorporates temperature, incoming solar radiation, and albedo. Incoming solar radiation was modeled for the period 2007-2012 and parameterized for the period 1912-2006; temperature was measured at the regional scale and synthesized for Robertson Glacier; and snowpack thickness was modeled using PRISM. The DD and ETI models both assume a static ice mass, i.e. no flow or change in ice elevation due to mass loss over the century time period. Both models estimate high annual and accumulated mean mass loss for the period 1912-2012. Sensitivity analyses of model inputs indicate that snowpack is an important factor, and PRISM estimates may underrepresent beginning-of-year snowpack by 220%, based on a comparison of modeled to measured values on the adjacent Haig Glacier. Avalanching is not a key component of accumulation on the Haig Glacier but is a key process at Robertson Glacier, and could locally double snowpack accumulation in avalanche zones. These factors, including the albedo changes that accompany a thicker snowpack, are all part of a compounding negative feedback cycle on glacier mass loss.
    In summary, the thesis highlights several potential limitations of the ETI and DD models for assessing mass loss for a small mountain glacier in the southern Canadian Rockies and provides suggestions for future modeling work in this region.
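The two model families named in the abstract have well-known general forms: a degree-day model that scales positive air temperatures by a melt factor, and an enhanced temperature-index model that adds an absorbed-shortwave term. The sketch below shows these general forms only; the factor values and function names are illustrative assumptions, not parameters from the thesis.

```python
def degree_day_melt(temps_c, ddf=5.0):
    """Daily melt (mm w.e.) from a classic degree-day model.

    temps_c: daily mean air temperatures (deg C)
    ddf: degree-day factor (mm w.e. / deg C / day) -- illustrative value
    Melt accrues only on positive-degree days.
    """
    return [ddf * max(t, 0.0) for t in temps_c]


def eti_melt(temps_c, sw_in, albedo, tf=0.05, srf=0.0094):
    """Enhanced temperature-index melt: adds absorbed shortwave radiation.

    sw_in: incoming shortwave radiation (W m^-2)
    albedo: surface albedo (0-1); (1 - albedo) is the absorbed fraction
    tf, srf: temperature and shortwave radiation factors -- illustrative
    """
    return [tf * t + srf * (1.0 - a) * g if t > 0.0 else 0.0
            for t, g, a in zip(temps_c, sw_in, albedo)]
```

The contrast the abstract draws is visible in the signatures: the DD model needs temperature alone, while the ETI model also consumes the radiation and albedo inputs whose parameterization the thesis discusses.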
  • On the usability of continuous time Bayesian networks: improving scalability and expressiveness
    (Montana State University - Bozeman, College of Engineering, 2017) Perreault, Logan Jared; Chairperson, Graduate Committee: John Sheppard
    The Continuous Time Bayesian Network (CTBN) is a model capable of compactly representing the behavior of discrete state systems that evolve in continuous time. This is achieved by factoring a Continuous Time Markov Process using the structure of a directed graph. Although CTBNs have proven themselves useful in a variety of applications, adoption of the model for use in real-world problems can be difficult. We believe this is due in part to limitations relating to scalability as well as representational power and ease of use. This dissertation attempts to address these issues. First, we improve the expressiveness of CTBNs by providing procedures that support the representation of non-exponential parametric distributions. We also propose the Continuous Time Decision Network (CTDN) as a framework for representing decision problems using CTBNs. This new model supports optimization of a utility value as a function of a set of possible decisions. Next, we address the issue of scalability by providing two distinct methods that compactly represent CTBNs by taking advantage of similarities in the model parameters. These compact representations are able to mitigate the exponential growth in parameters that CTBNs exhibit, allowing for the representation of more complex processes. We then take another approach to managing CTBN model complexity by introducing the concept of disjunctive interaction for CTBNs. Disjunctive interaction has been used in Bayesian networks to provide significant reductions in the number of parameters, and we have adapted this concept to provide the same benefits within the CTBN framework. Finally, we demonstrate how CTBNs can be applied to the real-world task of system prognostics and diagnostics. We show how models can be built and parameterized directly using information that is readily available for diagnostic models. We then apply these model construction techniques to build a CTBN describing a vehicle system. 
The vehicle model makes use of some of the newly introduced algorithms and techniques, including the CTDN framework and disjunctive interaction. This extended application not only demonstrates the utility of the novel contributions presented in this work, but also serves as a template for applying CTBNs to other real-world problems.
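A CTBN factors a continuous-time Markov process by giving each node an intensity matrix per parent-state configuration. As a point of reference for the abstract's terminology, here is a minimal sketch of one such intensity matrix for a binary-state node; the numeric values are illustrative assumptions, not parameters from the dissertation.

```python
import numpy as np

# Intensity matrix for a binary-state node: off-diagonal entries are
# transition intensities, and each row sums to zero, so the diagonal
# entry is the negated total rate of leaving that state.
# Values are illustrative only.
Q = np.array([[-0.8,  0.8],
              [ 0.3, -0.3]])

assert np.allclose(Q.sum(axis=1), 0.0)

# The expected sojourn time in state i of a continuous-time Markov
# process is -1 / Q[i, i].
expected_sojourn_state0 = -1.0 / Q[0, 0]  # 1.25 time units
```

In a full CTBN, a node with parents carries one such matrix per joint parent state (its conditional intensity matrices), which is where the parameter growth targeted by the dissertation's compact representations comes from.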
  • Inference and learning in Bayesian networks using overlapping swarm intelligence
    (Montana State University - Bozeman, College of Engineering, 2015) Fortier, Nathan Lee; Chairperson, Graduate Committee: John Sheppard
    While Bayesian networks provide a useful tool for reasoning under uncertainty, learning the structure of these networks and performing inference over them is NP-hard. We propose several heuristic algorithms to address the problems of inference, structure learning, and parameter estimation in Bayesian networks. The proposed algorithms are based on Overlapping Swarm Intelligence, a modification of particle swarm optimization in which a problem is broken into overlapping subproblems and a swarm is assigned to each subproblem. The algorithm maintains a global solution that is used for fitness evaluation, and is updated periodically through a competition mechanism. We describe how the problems of inference, structure learning, and parameter estimation can be broken into subproblems, and provide communication and competition mechanisms that allow swarms to share information about learned solutions. We also present a distributed alternative to Overlapping Swarm Intelligence that does not require a global network for fitness evaluation. For the problems of full and partial abductive inference, a swarm is assigned to each relevant node in the network. Each swarm learns the relevant state assignments associated with the Markov blanket for its corresponding node. In our approach to parameter estimation, a swarm is associated with each node in the network that corresponds to either a latent variable or a child of a latent variable. Each node's corresponding swarm learns the parameters associated with that node's Markov blanket. We also apply Overlapping Swarm Intelligence to several variations of the structure learning problem: learning Bayesian classifiers, learning Bayesian networks with complete data, and learning Bayesian networks with latent variables. For each problem, a swarm is associated with each node in the network. This work makes a number of contributions relating to the advancement of Overlapping Swarm Intelligence as a general optimization technique. 
We demonstrate the applicability of Overlapping Swarm Intelligence to both discrete and continuous optimization problems. We also examine the effect of the swarm architecture and degree of overlap on algorithm performance. The experiments presented here demonstrate that, while the sub-swarm architecture affects algorithm performance, Overlapping Swarm Intelligence continues to perform well even when there is little overlap between the swarms.
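The decomposition idea the abstract describes, overlapping subproblems, one swarm per subproblem, and a shared global solution updated through competition, can be sketched on a toy minimization problem. Everything below is an illustrative assumption: the overlap pattern, the constants, and the per-swarm optimizer (a simple stochastic hill climb standing in for full particle swarm optimization), not the dissertation's algorithm.

```python
import random

DIM = 4
# Overlapping subproblems: each "swarm" owns a window of variables and
# shares variables with its neighbors, as in overlapping decompositions.
SUBPROBLEMS = [(0, 1, 2), (1, 2, 3), (2, 3, 0)]


def fitness(x):
    # Toy objective: minimize the sum of squares (optimum at the origin).
    return sum(v * v for v in x)


def optimize(iters=200, candidates=10, seed=0):
    rng = random.Random(seed)
    # Shared global solution used for fitness evaluation by every swarm.
    global_sol = [rng.uniform(-5, 5) for _ in range(DIM)]
    for _ in range(iters):
        for idx in SUBPROBLEMS:
            # Each swarm perturbs only its own variables, evaluating
            # candidates in the context of the shared global solution.
            for _ in range(candidates):
                cand = list(global_sol)
                for i in idx:
                    cand[i] = global_sol[i] + rng.gauss(0.0, 0.5)
                if fitness(cand) < fitness(global_sol):
                    global_sol = cand  # competition: the better solution wins
    return global_sol
```

The key structural point carried over from the abstract is that no swarm ever optimizes the full variable vector; improvements propagate between swarms only through the shared variables and the global solution.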
  • Extensions to modeling and inference in continuous time Bayesian networks
    (Montana State University - Bozeman, College of Engineering, 2014) Sturlaugson, Liessman Eric; Chairperson, Graduate Committee: John Sheppard
    The continuous time Bayesian network (CTBN) enables reasoning about complex systems in continuous time by representing a system as a factored, finite-state, continuous-time Markov process. The dynamics of the CTBN are described by each node's conditional intensity matrices, determined by the states of the parents in the network. As the CTBN is a relatively new model, many extensions that have been defined with respect to Bayesian networks (BNs) have not yet been extended to CTBNs. This thesis presents five novel extensions to CTBN modeling and inference. First, we prove several complexity results specific to CTBNs. It is known that exact inference in CTBNs is NP-hard due to the use of a BN for the initial distribution. We prove that exact inference in CTBNs is still NP-hard, even when the initial states are given, and prove that approximate inference in CTBNs, as with BNs, is also NP-hard. Second, we formalize performance functions for the CTBN and show how they can be factored in the same way as the network, even when the performance functions are defined with respect to interaction between multiple nodes. Performance functions extend the model, allowing it to represent complex, user-specified functions of the behaviors of the system. Third, we present a novel method for node marginalization called "node isolation" that approximates a set of conditional intensity matrices with a single unconditional intensity matrix. The method outperforms previous node marginalization techniques in all of our experiments by better describing the long-term behavior of the marginalized nodes. Fourth, using the node isolation method we developed, we show how methods for sensitivity analysis of Markov processes can be applied to the CTBN while exploiting the conditional independence structure of the network. This enables efficient sensitivity analysis to be performed on our CTBN performance functions. 
Fifth, we formalize both uncertain and negative types of evidence in the context of CTBNs and extend existing inference algorithms to be able to support all combinations of evidence types. We show that these extensions make the CTBN more powerful, versatile, and applicable to real-world domains.
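The "node isolation" idea above replaces a node's set of conditional intensity matrices with a single unconditional one. A crude way to illustrate the concept is to average the conditional matrices weighted by an assumed distribution over parent states; this is a hedged sketch of the general idea only, with illustrative numbers, and the thesis's actual method is designed to capture long-term behavior more faithfully.

```python
import numpy as np

# Conditional intensity matrices (CIMs) of a binary-state node, one per
# parent state. Values are illustrative, not from the thesis.
cims = {
    "parent=0": np.array([[-1.0, 1.0], [0.5, -0.5]]),
    "parent=1": np.array([[-2.0, 2.0], [0.1, -0.1]]),
}
# Assumed (hypothetical) probabilities of each parent state.
parent_probs = {"parent=0": 0.7, "parent=1": 0.3}

# Collapse the CIMs into one unconditional intensity matrix.
Q_marg = sum(parent_probs[s] * q for s, q in cims.items())

# Mixing intensity matrices preserves the defining property that each
# row sums to zero, so Q_marg is itself a valid intensity matrix.
assert np.allclose(Q_marg.sum(axis=1), 0.0)
```

Because the marginalized matrix is again an ordinary Markov-process intensity matrix, standard Markov-process machinery, such as the sensitivity-analysis methods the abstract mentions, can be applied to it directly.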