Theses and Dissertations at Montana State University (MSU)
Permanent URI for this collection: https://scholarworks.montana.edu/handle/1/733
Search Results: 6 results
Item: Bayesian computing and sampling design for partially-surveyed spatial point process models (Montana State University - Bozeman, College of Letters & Science, 2020)
Flagg, Kenneth Allen; Chairperson, Graduate Committee: Andrew Hoegh
Andrew Hoegh and John Borkowski were co-authors of the article 'Modeling partially-surveyed point process data: inferring spatial point intensity of geomagnetic anomalies' in the 'Journal of Agricultural, Biological, and Environmental Statistics', which is contained within this dissertation. Andrew Hoegh was a co-author of the article 'The integrated nested Laplace approximation applied to spatial log-Gaussian Cox process models', submitted to the 'Journal of Applied Statistics', which is contained within this dissertation. John Borkowski and Andrew Hoegh were co-authors of the article 'Log-Gaussian Cox processes and sampling paths: towards optimal design', submitted to 'Spatial Statistics', which is contained within this dissertation.

Spatial point processes model situations such as unexploded ordnance, plant and animal populations, and celestial bodies, where events occur at distinct points in space. Point process models describe the number and spatial distribution of these events. These models have been mathematically understood for decades but were not widely used because of computational challenges. Computing advances over the last 30 years have kept interest alive, with several breakthroughs circa 2010 that made Bayesian spatial point process models practical for many applications. There is now interest in sampling, where the process is observed in only part of the study site. My dissertation work deals with sampling along paths, a standard feature of unexploded ordnance remediation studies.
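The log-Gaussian Cox process (LGCP) named in the article titles above can be illustrated with a minimal simulation sketch: a Gaussian random field is exponentiated to give a positive intensity surface, and event counts in each grid cell are conditionally Poisson. This is an illustrative sketch, not code from the dissertation; the grid size, mean, and covariance range are arbitrary assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coarse grid over the unit square (hypothetical study site).
n = 10
xs = np.linspace(0, 1, n)
X, Y = np.meshgrid(xs, xs)
pts = np.column_stack([X.ravel(), Y.ravel()])

# Gaussian random field with an exponential covariance function.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
cov = np.exp(-d / 0.3)  # range parameter 0.3 (assumed)
field = rng.multivariate_normal(np.full(n * n, 2.0), cov)

# LGCP: the latent intensity is the exponential of the field;
# event counts per cell are then conditionally Poisson.
cell_area = (1.0 / n) ** 2
intensity = np.exp(field)
counts = rng.poisson(intensity * cell_area)

print(counts.sum())  # total number of simulated events
```

Partial surveying then corresponds to observing `counts` only in cells intersected by the sampling path, which is what makes the design of that path matter.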
In this dissertation, I introduce a data augmentation procedure that adapts a Dirichlet process mixture model to sampling situations, and I provide the first comparison of a variety of sampling designs with regard to their spatial prediction performance for spatial log-Gaussian Cox process (LGCP) models. The Dirichlet process model remains computationally expensive in the sampling case, while the LGCP performs well with low computing time. The sampling design study shows that paths with regular spacing perform well, with corners and direction changes being helpful when the path is short.

Item: Intrusive uncertainty quantification method for simulations of gas-liquid multiphase flows (Montana State University - Bozeman, College of Engineering, 2020)
Turnquist, Brian Robert; Chairperson, Graduate Committee: Mark Owkes
Mark Owkes was a co-author of the article 'MULTIUQ: an intrusive uncertainty quantification tool for gas-liquid multiphase flows' in the 'Journal of Computational Physics', which is contained within this dissertation. Mark Owkes was a co-author of the article 'A fast, density decoupled pressure solver for an intrusive stochastic multiphase flow solver', submitted to the 'Journal of Computational Physics', which is contained within this dissertation. Mark Owkes was a co-author of the article 'MULTIUQ: a software package for uncertainty quantification of multiphase flows', submitted to 'Computer Physics Communications', which is contained within this dissertation. Mark Owkes was a co-author of the article 'Exploration of basis functions for projecting a stochastic level set in a multiphase flow solver', submitted to 'Atomization and Sprays', which is contained within this dissertation.

Simulations of fluid dynamics play an increasingly important role in the development of new technology.
For example, engineers may need to simulate an atomizing jet to create a better direct-injection system for improving fuel economy in a vehicle, or to spray water more efficiently in building fire mitigation systems. The increased use of computational fluid dynamics requires methodological improvements in simulation efficiency and accuracy. We can extract a great deal from these models, including uncertainty information. Although simulations of gas-liquid multiphase flow scenarios are common, most are deterministic in nature: model parameters, such as fluid density or viscosity, are assumed to be known and fixed. This is not usually the case, and a research gap exists for uncertainty analysis in these simulations. For efficient performance, an intrusive approach is used to create a multiphase solver capable of uncertainty analysis. Variables of interest, such as velocity and pressure, are converted into stochastic variables that are allowed to vary in an added uncertainty dimension. Variability is then added to fluid parameters or initial/boundary conditions, and a simulation is run that produces stochastic results. To verify the solver, several cases are presented that compare the solver against analytic solutions. Once satisfied with the solver's ability, we can answer questions about more complex scenarios. For instance, we may ask how uncertainty in the surface tension force affects the atomization of a jet, and find that fluids with a lower surface tension coefficient break up sooner (as expected). We can also consider scenarios without such an obvious outcome, such as imposing uncertainty on the density ratio of an atomizing jet to determine the effect of running simulations at low vs. high density ratios. multiUQ is capable of producing accurate results for real-world situations.
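The intrusive idea described above can be caricatured in a few lines: instead of holding a single deterministic value, each variable carries one value per sample of an added uncertainty coordinate, and every solver update is applied across that whole dimension at once. The toy velocity update and every parameter value below are illustrative assumptions, not multiUQ code.

```python
import numpy as np

# Sample the added "uncertainty dimension" at a set of points.
n_uq = 64
xi = np.linspace(-1, 1, n_uq)  # stochastic coordinate

# An uncertain fluid parameter: density with +/-10% variability (assumed).
rho = 1000.0 * (1.0 + 0.1 * xi)  # one value per uncertainty sample

# A deterministic solver step becomes vectorized over the uncertainty
# dimension: here, a toy velocity update from a fixed pressure gradient,
# du/dt = -(1/rho) * dp/dx.
dt, dx, dp = 1e-3, 0.01, 50.0
u = np.zeros(n_uq)
for _ in range(100):
    u = u + dt * (-dp / dx) / rho

# The result is stochastic: statistics come directly from the array.
print(u.mean(), u.std())
```

Because the uncertain parameter enters every update, the spread of `u` at the end directly reflects the imposed density variability, which is the kind of output a deterministic run cannot provide.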
As a tool, it can provide additional insight into complicated multiphase flow systems.

Item: Efficient machine learning using partitioned restricted Boltzmann machines (Montana State University - Bozeman, College of Engineering, 2016)
Tosun, Hasari; Chairperson, Graduate Committee: John Sheppard

Restricted Boltzmann Machines (RBMs) are energy-based models that are used as generative learning models as well as crucial components of Deep Belief Networks (DBNs). The most successful training method to date for RBMs is Contrastive Divergence. However, Contrastive Divergence is inefficient when the number of features is very high and the mixing rate of the Gibbs chain is slow. We develop a new training method that partitions a single RBM into multiple overlapping atomic RBMs. Each partition (RBM) is trained on a section of the input vector. Because the model is partitioned into smaller RBMs, all available data can be used for training, and the individual RBMs can be trained in parallel. Moreover, as the number of dimensions increases, the number of partitions can be increased to significantly reduce runtime computational resource requirements. Other recently developed methods for training RBMs suffer a serious disadvantage under bounded computational resources: one is forced to use a subsample of the whole dataset, run fewer iterations (an early stopping criterion), or both. Our Partitioned-RBM method provides an innovative scheme to overcome this shortcoming. By analyzing the role of spatial locality in Deep Belief Networks, we show that spatially local information becomes diffused as the network becomes deeper. We demonstrate that deep learning based on partitioning of RBMs is capable of retaining spatially local information.
As a result, in addition to the computational improvement, the reconstruction and classification accuracy of the model are also improved by our Partitioned-RBM training method.

Item: Economic analysis of farm firm growth in a semi-arid area of Montana (Montana State University - Bozeman, College of Agriculture, 1972)
Larson, Donald Keith

Item: Stochastic simulation model for the Hawaiian monk seal (Montana State University - Bozeman, College of Letters & Science, 2002)
Harting, Albert Lively

Item: Achieving self-managed deployment in a distributed environment via utility functions (Montana State University - Bozeman, College of Engineering, 2008)
Deb, Debzani; Chairperson, Graduate Committee: John Paxton; Michael Oudshoorn (co-chair)

This dissertation presents algorithms and mechanisms that enable self-managed, scalable, and efficient deployment of large-scale scientific and engineering applications in a highly dynamic and unpredictable distributed environment. Typically these applications are composed of a large number of distributed components, and it is important to meet the computational power and network bandwidth requirements of those components and their interactions. However, satisfying these requirements in a large-scale, shared, heterogeneous, and highly dynamic distributed environment is a significant challenge. As systems and applications grow in scale and complexity, attaining the desired level of performance in this uncertain environment using current approaches based on global knowledge, centralized scheduling, and manual reallocation becomes infeasible. This dissertation focuses on modeling the application and the underlying architecture into a common abstraction, and on incorporating autonomic features into those abstractions to achieve self-managed deployment.
In particular, we developed techniques for automatically identifying application components and their estimated resource requirements, and used them to model the application as a graph abstraction. We also developed techniques that allow the distributed resources to self-organize in a utility-aware way while assuming minimal knowledge about the system. Finally, to achieve self-managed deployment of application components to the distributed nodes, we designed a scalable and adaptive scheduling algorithm governed by a utility function. The utility function, which combines several application- and system-level attributes, governs both the initial deployment of the application components and their reconfiguration despite the dynamism and uncertainty associated with the computing environment. The experimental results show that it is possible to achieve and maintain efficient deployment by applying the derived utility function based solely on locally available information, without costly global communication or synchronization. The self-management is therefore decentralized and provides better adaptability, scalability, and robustness.
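As an illustration of the general idea only (not the dissertation's actual utility function), a node-selection utility might combine a system-level attribute (current load) with an application-relevant one (available bandwidth) under assumed weights; every name and number below is hypothetical.

```python
# Toy utility for placing one application component on a node, combining
# load and bandwidth attributes with assumed weights (0.6 and 0.4).
def utility(node, w_cpu=0.6, w_bw=0.4):
    return w_cpu * (1.0 - node["load"]) + w_bw * node["bandwidth"]

# Hypothetical nodes with locally measurable, normalized attributes.
nodes = [
    {"name": "n1", "load": 0.9, "bandwidth": 0.8},
    {"name": "n2", "load": 0.3, "bandwidth": 0.5},
    {"name": "n3", "load": 0.5, "bandwidth": 0.9},
]

# Deployment picks the best-scoring node; periodic re-evaluation of the
# same scores can trigger reconfiguration as conditions drift.
best = max(nodes, key=utility)
print(best["name"])  # prints "n3"
```

Because each node can evaluate such a function from locally available attributes, placement and reconfiguration decisions need no global communication or synchronization, which is the decentralization property the abstract describes.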