Theses and Dissertations at Montana State University (MSU)
Permanent URI for this community: https://scholarworks.montana.edu/handle/1/732
Search Results
9 results
Item: Instructor usage of learning management systems utilizing a technology acceptance model (Montana State University - Bozeman, College of Education, Health & Human Development, 2017). Brown, Lisa Ann. Chairperson, Graduate Committee: Arthur W. Bangert.

Limited research exists on the factors that influence an instructor's choice to use a learning management system. The purpose of the current study is to explore how task-technology fit constructs relate to the other constructs that comprise Davis' Technology Acceptance Model (TAM). The technology acceptance model is widely used as an indicator of actual use of a technology system. A sample of 284 instructors completed a survey consisting of demographic questions, open-ended questions about their reasons for choosing to use a learning management system, and Likert-scale questions about the six constructs of the research model: task-technology fit, ease of use, usefulness, attitude, intent to use, and actual use. The relationships between the TAM constructs and task-technology fit were analyzed using a partial least squares structural equation modeling (PLS-SEM) approach in SmartPLS. The relationship between task-technology fit and actual use was mediated by ease of use, usefulness, attitude, and intent to use. To evaluate the constructs in the model, an exploratory factor analysis was conducted, and the factor structures for online and face-to-face instructors were found to differ. To account for this difference, two models were developed, one for face-to-face instructors and one for online instructors, and each was evaluated separately. The study found significant relationships between all the TAM constructs and task-technology fit for face-to-face instructors. The relationship between attitude and intent to use was not significant for online instructors. This research supports the need for further study of the differences between online and face-to-face instructors' perceptions of technology use. The differing instructional needs of face-to-face and online instructors have implications for the training and support an institution should provide to increase usage of learning management systems.
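A minimal sketch of the path structure described in this abstract. The thesis estimated the model with PLS-SEM in SmartPLS; the snippet below only approximates the idea with ordinary least squares on synthetic construct scores, and every variable name, coefficient, and data value is hypothetical.

```python
# Crude stand-in for the TAM / task-technology-fit path model: each structural
# equation is fit by ordinary least squares on construct scores. The thesis
# itself used PLS-SEM (SmartPLS); all data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 284  # sample size reported in the abstract

# Synthetic construct scores standing in for averaged Likert items.
ttf = rng.normal(size=n)                                     # task-technology fit
eou = 0.6 * ttf + rng.normal(scale=0.5, size=n)              # ease of use
use = 0.5 * ttf + 0.4 * eou + rng.normal(scale=0.5, size=n)  # usefulness
att = 0.5 * use + 0.3 * eou + rng.normal(scale=0.5, size=n)  # attitude
intent = 0.7 * att + rng.normal(scale=0.5, size=n)           # intent to use
actual = 0.6 * intent + rng.normal(scale=0.5, size=n)        # actual use

def path(y, *predictors):
    """Estimate one structural equation's path coefficients by least squares."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.round(beta[1:], 3)  # drop the intercept

print("TTF -> ease of use     :", path(eou, ttf))
print("TTF, EOU -> usefulness :", path(use, ttf, eou))
print("USE, EOU -> attitude   :", path(att, use, eou))
print("ATT -> intent to use   :", path(intent, att))
print("INT -> actual use      :", path(actual, intent))
```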
Item: Particle imaging velocimetry data assimilation using least-square finite element methods (Montana State University - Bozeman, College of Engineering, 2016). Rajaraman, Prathish Kumar. Chairperson, Graduate Committee: Jeffrey Heys. T. A. Manteuffel, M. Belohlavek, E. McMahon and Jeffrey J. Heys were co-authors of the article 'Echocardiographic particle imaging velocimetry data assimilation with least square finite element methods' in the journal 'Computers & Mathematics with Applications', which is contained within this thesis. G. D. Vo, G. Hansen and Jeffrey J. Heys were co-authors of the article 'Comparison of continuous and discontinuous finite element methods for parabolic differential equations employing implicit time integration', submitted to the journal 'International Journal of Numerical Methods for Heat & Fluid Flow', which is contained within this thesis. T. A. Manteuffel, M. Belohlavek and Jeffrey J. Heys were co-authors of the article 'Combining existing numerical models with data assimilation using weighted least-squares finite element methods', submitted to the journal 'International Journal for Numerical Methods in Biomedical Engineering', which is contained within this thesis.

Recent advancements in the field of echocardiography have introduced various methods to image blood flow in the heart. Of particular interest is the left ventricle (LV) of the heart, which pumps oxygenated blood from the lungs out through the aorta. One method for imaging blood flow is injecting FDA-approved microbubbles into the left ventricle; using the motion of the microbubbles and the frame rate of the ultrasound scan, the blood velocity can then be calculated. In addition to blood velocity, echocardiologists are also interested in calculating pressure gradients and other flow properties, but this is not currently possible because the velocity data obtained is two-dimensional and contains noise. In order to realize the full potential of microbubbles as a tool for determining the pumping efficiency and health of the LV, three-dimensional velocity data is required. Our goal is to assimilate two-dimensional velocity data from ultrasound experiments into a three-dimensional computer model. To achieve this objective, a numerical method is needed that can approximate the solution of a system of differential equations and assimilate an arbitrary number of noisy experimental values at arbitrary points within the domain of interest, providing a "most probable" approximate solution that is appropriately influenced by the experimental data. In this thesis we present two different approaches for data assimilation: the first approach is more computationally expensive but requires only a single step; the second uses a two-stage data assimilation technique and is computationally less expensive. The motivation for using the least-squares finite element method is that it can match the numerical solution more closely to more accurate data and less closely to less accurate data.
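A one-dimensional toy version of the data-assimilation idea described above: the discrete solution is chosen to minimize a weighted combination of a differential-equation residual and the mismatch with noisy point "measurements", so the result is shaped by both the model and the data. The boundary-value problem, weights, and observations below are invented for illustration and are not the thesis's least-squares finite element formulation.

```python
# Toy least-squares data assimilation: pick u minimizing
#   w_model * ||A u - f||^2 + w_data * ||H u - d||^2
# where A u = f discretizes -u'' = f with u(0) = u(1) = 0 and (H, d) are noisy
# point observations. Everything here is synthetic.
import numpy as np

n = 51
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Second-difference operator with Dirichlet rows at the endpoints.
A = np.zeros((n, n))
A[0, 0] = A[-1, -1] = 1.0
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -1.0, 2.0, -1.0
    A[i, i - 1:i + 2] /= h**2
f = np.pi**2 * np.sin(np.pi * x)   # exact solution is sin(pi * x)
f[0] = f[-1] = 0.0

# Noisy "experimental" observations at a few interior points.
rng = np.random.default_rng(1)
obs_idx = rng.choice(np.arange(5, n - 5), size=10, replace=False)
H = np.zeros((len(obs_idx), n))
H[np.arange(len(obs_idx)), obs_idx] = 1.0
d = np.sin(np.pi * x[obs_idx]) + rng.normal(scale=0.05, size=len(obs_idx))

# Normal equations of the combined weighted least-squares problem.
w_model, w_data = 1.0, 50.0
lhs = w_model * A.T @ A + w_data * H.T @ H
rhs = w_model * A.T @ f + w_data * H.T @ d
u = np.linalg.solve(lhs, rhs)

print("max error vs exact solution:", np.abs(u - np.sin(np.pi * x)).max())
```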
Item: Continuation approaches for nonlinear least squares (Montana State University - Bozeman, College of Engineering, 2015). Wold, Joshua Ivan. Chairperson, Graduate Committee: Steven R. Shaw.

Output error (OE) minimization is a commonly used method for parameter estimation. It has the virtues that, under reasonable conditions, the parameter estimates it produces are good (often maximum likelihood) and that only a simulation model is required to compute the predictions, making it equally useful for linear and nonlinear systems. Unfortunately, computing the OE parameter estimates involves minimizing a nonlinear cost function, which requires an iterative nonlinear least-squares (NLLS) algorithm such as Gauss-Newton. The present work begins with the observation that the one-step linear prediction error (PE) method, for which the parameter estimates can be computed via linear least squares but which suffers from noise-related bias, can be viewed as a particular transformation of the OE residuals. Further, this transformation can be parameterized in such a way that a continuation of residual sequences is available that combines the properties of PE and OE. Minimizing the sequence of cost functions associated with this continuation is shown to be a reliable method for finding the OE global minimum. It is later shown that the continuation need not begin with PE and that, in principle, any linear transformation of the OE residuals may be used, provided the resulting sequence of cost-function global minima meets certain criteria. These criteria are provided in the form of a theorem which, in turn, makes use of the Implicit Function Theorem, an elementary result from analysis. The result is a method for guaranteeing that the continuation procedure has converged on the OE global minimum once a linear transformation has been chosen. No closed-form solution is given for the design of the transformation, but two designs are provided and shown to be successful on test problems.
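A small sketch of the PE-to-OE continuation idea for a first-order system: a blending parameter moves the residual from the one-step prediction error (easy to minimize) toward the output error (the quantity of interest), and each intermediate problem is warm-started from the previous solution. This is an illustrative scheme with made-up data and scipy's general-purpose solver, not the specific transformations designed in the thesis.

```python
# Continuation from prediction error (lam = 1) to output error (lam = 0) for
# the system y[k] = a*y[k-1] + b*u[k-1] + noise. Synthetic data only.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
N = 400
u = rng.normal(size=N)
a_true, b_true = 0.9, 0.5
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1]
y += rng.normal(scale=0.1, size=N)          # measurement noise

def residuals(theta, lam):
    a, b = theta
    y_sim = np.zeros(N)
    for k in range(1, N):
        # Blend measured and simulated previous output: PE uses the measured
        # y (lam = 1), OE uses the pure simulation (lam = 0).
        prev = lam * y[k - 1] + (1.0 - lam) * y_sim[k - 1]
        y_sim[k] = a * prev + b * u[k - 1]
    return y[1:] - y_sim[1:]

theta = np.array([0.0, 0.0])
for lam in np.linspace(1.0, 0.0, 11):       # continuation from PE to OE
    theta = least_squares(residuals, theta, args=(lam,)).x

print("estimated (a, b):", theta)           # should land near (0.9, 0.5)
```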
Item: Nonparametric estimation of semivariogram functions (Montana State University - Bozeman, College of Letters & Science, 1994). Cherry, John Steven.

Item: Numerical solution of nonlinear regressions under linear constraints (Montana State University - Bozeman, College of Agriculture, 1975). Yeh, Ning-Chia.

Item: Non-linear least squares estimation with an iteratively updated data set (Montana State University - Bozeman, College of Engineering, 2003). Keppler, Marc Leonard. Chairperson, Graduate Committee: Steven R. Shaw.

Item: An economic analysis of the impact of decoupled payments on farm solvency in the United States (Montana State University - Bozeman, College of Agriculture, 2014). Hasenoehrl, Amy Rae. Co-chairs, Graduate Committee: Eric Belasco and Anton Bekkerman.

This thesis evaluates the effects of decoupled agricultural support payments on the debt-to-asset ratio of farmers in the top five states producing corn, cotton, wheat, and soybeans from 1996 to 2011. Building on existing literature, this study estimates the broader impacts of decoupled payments on farm solvency by considering all decoupled payments made since their establishment in 1996. A theoretical model of profit maximization identifies the factors predicted to influence solvency, which include farm assets, income, expenses, scale, production risk, decoupled payments, and operator characteristics. Following the literature, the relationships between these factors and farm solvency are estimated empirically using a linear regression model with data from the Agricultural Resource Management Survey, Farm Service Agency, and Risk Management Agency. The results indicate that decoupled payments have a positive relationship with the debt-to-asset ratio and that the elimination of decoupled payments in the upcoming Farm Bill could decrease farmers' debt-to-asset ratios by an average of approximately ten percent. Furthermore, an analysis of the effects of decoupled payments by primary crop designation suggests that only corn, soybean, and wheat farmers' debt-to-asset ratios are significantly responsive to changes in decoupled payments. This study also finds that the effect of decoupled payments on solvency is uniform across farm sizes. In addition to these results, this thesis contributes to the current literature by providing preliminary evidence of an endogenous relationship between acres operated and the debt-to-asset ratio, which appears to introduce a positive bias in the parameter estimate for decoupled payments in the linear regression model. When a two-stage least squares model is used to control for this bias, the results show a negative relationship between decoupled payments and the debt-to-asset ratio. Due to the change in the coefficient on decoupled payments between the two models, this study suggests that results from research failing to account for a potential endogenous relationship between acres and the debt-to-asset ratio should be interpreted with caution.
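The endogeneity correction mentioned in this abstract can be illustrated with a small two-stage least squares example. The instrument, variable names, data, and coefficients below are entirely synthetic, not the ARMS, FSA, or RMA data or specification used in the thesis; the point is only how 2SLS replaces an endogenous regressor with its fitted values from a first-stage regression.

```python
# Toy two-stage least squares (2SLS) illustration. All data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 2000

z = rng.normal(size=n)                      # instrument for acres operated
v = rng.normal(size=n)                      # unobserved factor driving the bias
acres = 1.0 + 0.8 * z + v + rng.normal(size=n)        # endogenous regressor
payments = rng.normal(size=n)                          # "decoupled payments"
debt_ratio = 0.3 * acres - 0.2 * payments + v + rng.normal(size=n)

def ols(y, X):
    """Least-squares coefficients with an added intercept column."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

# Naive OLS: the acres coefficient absorbs the correlation with the
# unobservable v (in the thesis, this bias showed up in the payments estimate).
print("OLS :", ols(debt_ratio, np.column_stack([acres, payments]))[1:])

# Stage 1: project acres onto the instrument and exogenous variables.
stage1 = ols(acres, np.column_stack([z, payments]))
acres_hat = np.column_stack([np.ones(n), z, payments]) @ stage1
# Stage 2: use fitted acres in place of the endogenous regressor.
print("2SLS:", ols(debt_ratio, np.column_stack([acres_hat, payments]))[1:])
```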
Item: Weighted least-squares finite element methods for PIV data assimilation (Montana State University - Bozeman, College of Engineering, 2011). Wei, Fei. Chairperson, Graduate Committee: Jeffrey Heys.

The ability to diagnose irregular flow patterns clinically in the left ventricle (LV) is currently very challenging. One potential approach for non-invasively measuring blood flow dynamics in the LV is particle image velocimetry (PIV) using microbubbles. To obtain local flow velocity vectors and velocity maps, PIV software calculates displacements of microbubbles over a given time interval, which is typically determined by the actual frame rate. In addition to the PIV data, ultrasound images of the left ventricle can be used to determine the wall position as a function of time and the inflow and outflow fluid velocity during the cardiac cycle. Despite the abundance of data, ultrasound and PIV alone are insufficient for calculating the flow properties of interest to clinicians. Specifically, the pressure gradient and total energy loss are of primary importance, but their calculation requires a full three-dimensional velocity field. Echo-PIV only provides 2D velocity data along a single plane within the LV. Further, numerous technical hurdles prevent three-dimensional ultrasound from having a sufficiently high frame rate (currently approximately 10 frames per second) for 3D PIV analysis. Beyond microbubble imaging in the left ventricle, there are a number of other settings where 2D velocity data is available using PIV but a full 3D velocity field is desired. This thesis develops a novel methodology to assimilate two-dimensional PIV data into a three-dimensional computational fluid dynamics (CFD) simulation with moving domains. To illustrate and validate our approach, we tested it on three different problems: a flap displaced by a fluid jet, an expanding hemisphere, and an expanding half ellipsoid representing the left ventricle of the heart. To account for the changing shape of the domain in each problem, the CFD mesh was deformed using a pseudo-solid domain mapping technique at each time step. The incorporation of experimental PIV data can help to identify when the imposed boundary conditions are incorrect. This approach can also help to capture effects that are not modeled directly, such as the impact of heart valves on the flow of blood into the left ventricle.

Item: Empirical least squares regression models for employment, in- and out-migration, and income distribution in the Northern Great Plains region of the United States (Montana State University - Bozeman, College of Agriculture, 1974). Lewis, Eugene Patrick, 1948-. Chairperson, Graduate Committee: Lloyd D. Bender.

This research effort is aimed at determining empirical least squares regression models for employment, in- and out-migration, and income distribution. Secondary data are used exclusively; the observations are 181 non-metro counties in the Northern Great Plains region of the United States. The statistical results show that all four models are directly linked to variations in the economic bases of counties. To some extent, this allows the models to be used concurrently in determining impacts. It was hypothesized and shown that the multiplier effect for employment varies with industry, scale of operation of the various industries, and location in economic space. This conclusion, along with the successful inclusion of migration and income distribution, suggests that the approach taken in this study is a possible alternative to standard aggregate economic base and input-output studies.
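A minimal sketch of the kind of economic-base employment regression described in the last abstract, in which the multiplier is allowed to differ by industry through separate coefficients on each basic-sector employment variable. The county data, sector names, and coefficients below are entirely synthetic and stand in only for the study's general approach, not its actual models or data.

```python
# Toy economic-base regression: total county employment regressed on basic
# (export) employment by sector, giving sector-specific multipliers.
import numpy as np

rng = np.random.default_rng(4)
n = 181                                     # number of counties, as in the study

basic_ag = rng.gamma(2.0, 500.0, size=n)    # synthetic basic employment, agriculture
basic_mfg = rng.gamma(2.0, 300.0, size=n)   # synthetic basic employment, manufacturing
total = 1.8 * basic_ag + 2.6 * basic_mfg + rng.normal(scale=200.0, size=n)

X = np.column_stack([np.ones(n), basic_ag, basic_mfg])
beta = np.linalg.lstsq(X, total, rcond=None)[0]
print("estimated employment multipliers (ag, mfg):", beta[1], beta[2])
```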