Theses and Dissertations at Montana State University (MSU)
Permanent URI for this community: https://scholarworks.montana.edu/handle/1/732
Search Results
3 results
Item: Markov partitions and sofic codings for Anosov diffeomorphisms of nilmanifolds
(Montana State University - Bozeman, College of Letters & Science, 2020) Fink, Eric Raymond; Chairperson, Graduate Committee: Jaroslaw Kwapisz

Given an Anosov diffeomorphism of a compact manifold, the existence of a Markov partition and the associated conjugate symbolic dynamical system has been known for over fifty years, by a celebrated result of Sinai subsequently extended by Bowen. Building on the work of many authors in the context of hyperbolic toral automorphisms, we give an explicit arithmetic construction of sofic codings and Markov partitions for Anosov diffeomorphisms of nilmanifolds. Nilmanifolds arise as quotients of nilpotent Lie groups by discrete, co-compact subgroups (lattices), and are conjecturally the only manifolds admitting Anosov diffeomorphisms, up to a finite covering.

Item: Extensions to modeling and inference in continuous time Bayesian networks
(Montana State University - Bozeman, College of Engineering, 2014) Sturlaugson, Liessman Eric; Chairperson, Graduate Committee: John Sheppard

The continuous time Bayesian network (CTBN) enables reasoning about complex systems in continuous time by representing a system as a factored, finite-state, continuous-time Markov process. The dynamics of the CTBN are described by each node's conditional intensity matrices, determined by the states of the parents in the network. As the CTBN is a relatively new model, many extensions that have been defined for Bayesian networks (BNs) have not yet been carried over to CTBNs. This thesis presents five novel extensions to CTBN modeling and inference.

First, we prove several complexity results specific to CTBNs. It is known that exact inference in CTBNs is NP-hard due to the use of a BN for the initial distribution. We prove that exact inference in CTBNs remains NP-hard even when the initial states are given, and prove that approximate inference in CTBNs, as with BNs, is also NP-hard. Second, we formalize performance functions for the CTBN and show how they can be factored in the same way as the network, even when the performance functions are defined with respect to interactions between multiple nodes. Performance functions extend the model, allowing it to represent complex, user-specified functions of the behaviors of the system. Third, we present a novel method for node marginalization called "node isolation" that approximates a set of conditional intensity matrices with a single unconditional intensity matrix. The method outperforms previous node marginalization techniques in all of our experiments by better describing the long-term behavior of the marginalized nodes. Fourth, using the node isolation method we developed, we show how methods for sensitivity analysis of Markov processes can be applied to the CTBN while exploiting the conditional independence structure of the network. This enables efficient sensitivity analysis of our CTBN performance functions. Fifth, we formalize both uncertain and negative types of evidence in the context of CTBNs and extend existing inference algorithms to support all combinations of evidence types. We show that these extensions make the CTBN more powerful, versatile, and applicable to real-world domains.

Item: Parameters of Dynamic Markov Compression
(Montana State University - Bozeman, College of Engineering, 1996) Orser, Gary D.
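As a concrete illustration of the CTBN dynamics described in the second abstract (each node governed by intensity matrices conditioned on its parents' states), the following is a minimal sketch of sampling a trajectory from a hypothetical two-node network. All node names, state spaces, and rates here are illustrative assumptions, not taken from the thesis.

```python
import random

# Hypothetical two-node CTBN: binary parent X with an unconditional
# intensity matrix, and binary child Y whose intensity matrix is
# selected by the current state of X.

# Q[i][j] is the transition rate from state i to state j; each diagonal
# entry is the negated row sum, i.e. the total rate of leaving state i.
Q_X = [[-0.5, 0.5],
       [ 1.0, -1.0]]

# One conditional intensity matrix for Y per state of its parent X.
Q_Y_given_X = {
    0: [[-0.2, 0.2], [0.4, -0.4]],
    1: [[-2.0, 2.0], [3.0, -3.0]],
}

def simulate(t_end, x=0, y=0, seed=0):
    """Sample a trajectory of (X, Y) up to time t_end.

    Each sojourn time is exponential with rate equal to the total
    leaving rate of the joint state; the node that transitions is
    chosen in proportion to its own leaving rate (a standard
    competing-exponential-clocks scheme).
    """
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, x, y)]
    while True:
        rate_x = -Q_X[x][x]                 # X's leaving rate
        rate_y = -Q_Y_given_X[x][y][y]      # Y's leaving rate given X
        total = rate_x + rate_y
        t += rng.expovariate(total)
        if t >= t_end:
            break
        if rng.random() < rate_x / total:
            x = 1 - x                       # binary node: flip state
        else:
            y = 1 - y
        traj.append((t, x, y))
    return traj

trajectory = simulate(10.0)
```

Note how the factorization shows up: only Y's rates depend on X, so the joint process never needs an explicit 4x4 generator, which is the representational saving the CTBN exploits.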