Theses and Dissertations at Montana State University (MSU)

Permanent URI for this collection: https://scholarworks.montana.edu/handle/1/733

Search Results

Now showing 1 - 10 of 19
  • Item
    Evaluating alfalfa weevil (Hypera postica) resistance to mode of action group 3A pyrethroid insecticides in the western United States
    (Montana State University - Bozeman, College of Agriculture, 2023) Rodbell, Erika Adriana; Chairperson, Graduate Committee: Kevin Wanner; This is a manuscript style paper that includes co-authored chapters.
    Alfalfa weevil (Hypera postica Gyllenhal [Coleoptera: Curculionidae]) is an insect pest of forage alfalfa (Medicago sativa L. [Fabales: Fabaceae]) in the western United States. Over the last half-century, insecticides have been the primary control tactic used by alfalfa producers. However, in 2015, numerous failures of pyrethroid insecticides (mode of action (MoA) 3A) to control alfalfa weevil populations were reported. In 2019, Montana producers reported the same failures in their production systems. Therefore, research efforts in the Wanner Lab commenced in 2020 with the goal of identifying pyrethroid-resistant alfalfa weevil populations in the western United States. The research had four objectives. The first was to identify alfalfa weevil lambda-cyhalothrin resistance and susceptibility in Montana. The second was to identify lambda-cyhalothrin resistance and susceptibility in Arizona, California, Montana, Oregon, Washington, and Wyoming. The third was to determine whether resistance to lambda-cyhalothrin resulted in the loss of efficacy of other MoA 3A active ingredients. The fourth was to develop a case study addressing integrated resistance management recommendations for mitigating alfalfa weevil pyrethroid resistance. We conducted our research through contact bioassays, molecular genomics, and field trials to corroborate our results and to determine whether alfalfa weevil strain influenced the documented pattern of resistance. Cumulatively, our results suggest that alfalfa weevil lambda-cyhalothrin resistance is present in Arizona, California, Montana, Oregon, Washington, and Wyoming, and that susceptible populations remain in the western region. Our data further illustrate that regardless of strain, alfalfa weevils resistant to lambda-cyhalothrin will be resistant to other type II pyrethroid active ingredients and to permethrin, a pattern seen in three distinct alfalfa production zones in the western United States (i.e., Arizona, Montana, and Washington) and confirmed by both contact bioassays and field trials. In conclusion, our results illustrate a challenge facing forage alfalfa production in the western United States and provide strategies that western forage alfalfa producers can employ to keep pyrethroid resistance from developing.
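    As a rough illustration of the contact-bioassay analysis the abstract describes, the sketch below fits a probit dose-mortality model and computes a resistance ratio (field LC50 over susceptible LC50). It is a minimal Python sketch; the doses, counts, and function names are illustrative assumptions, not the thesis's data or methods.

      import numpy as np
      import statsmodels.api as sm

      def fit_lc50(dose, n, dead):
          """Probit regression of mortality on log10(dose); returns the LC50."""
          X = sm.add_constant(np.log10(dose))
          fam = sm.families.Binomial(link=sm.families.links.Probit())
          res = sm.GLM(np.column_stack([dead, n - dead]), X, family=fam).fit()
          b0, b1 = res.params
          return 10 ** (-b0 / b1)  # dose giving 50% mortality

      dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0])  # hypothetical ug a.i. per vial
      n = np.array([30, 30, 30, 30, 30])
      dead_lab = np.array([4, 12, 22, 28, 30])     # susceptible reference strain
      dead_field = np.array([0, 2, 6, 15, 24])     # suspect field population

      rr = fit_lc50(dose, n, dead_field) / fit_lc50(dose, n, dead_lab)
      print(f"resistance ratio = {rr:.1f}")  # RR well above 1 suggests resistance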
  • Item
    Importance of foot and leg structure for beef cattle in forage-based production systems: characterizing foot and leg scores for Montana Angus cattle
    (Montana State University - Bozeman, College of Agriculture, 2023) Sitz, Taylre Elizebeth; Chairperson, Graduate Committee: Timothy DelCurto
    The objectives of this study were to increase the amount of phenotypic data available for enhancing the foot and claw EPDs of Montana sires and to evaluate factors that could impact foot angle and claw set scores. Specifically, this study evaluated the interaction of sex and age on claw set and foot angle scores of front or hind legs. Researchers used the American Angus Association (AAA) Foot Scoring Guidelines to subjectively analyze claw set and foot angle on a series of Montana Angus herds, scoring a total of 4,723 cattle: 1,475 yearling bulls, 992 yearling heifers, 1,044 two- and three-year-old cows, and 1,212 cows four years and older. The AAA Foot Scoring Guidelines require breeders to score the combined "worst foot" for both the claw set and foot angle traits on a hard, flat surface. Yearling bulls had mean foot angle and claw set scores 0.12 and 0.20 greater, respectively, than yearling heifers (P < 0.01). The proportion of scores that differed from 5 (the ideal foot score) was greater (P < 0.01) for front feet than for hind feet, with 61.5 and 74.5% of the scores not equal to 5 being front-feet issues for yearling heifers and bulls, respectively. Foot angle scores increased linearly (P < 0.01) with advancing cow age, ranging from 5.15 for heifers to 5.80 for cows 4 years and older. Likewise, claw set scores increased quadratically (P < 0.01) as a function of cow age. The location of the "worst foot" also changed quadratically with age (P < 0.01), with the majority of problem feet in 2- and 3-year-old cows and cows 4 years and older being hind-feet issues (70.5 and 77.1%, respectively). The proportion of foot angle and claw set scores not equal to 5 also increased quadratically with age (P < 0.01), with heifers having the lowest proportions (15.8 and 31.7%, respectively) compared to cows 4 years and older (66.0 and 68.0%, respectively). Among sire lines, the range in mean progeny foot angle score between the greatest and least sire lines was 0.60; for claw set, a similar range of 0.57 was observed. Sire line affected progeny claw set (P < 0.05) and foot angle (P < 0.05) scores, as well as the variation of progeny foot scores. In summary, progress is being made by utilizing the AAA foot scoring guidelines as well as foot angle and claw set EPDs. Additional improvements may be possible with continued model refinement and with scoring guidelines specific to age and sex effects.
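    The age trends reported here (linear for foot angle, quadratic for claw set) are the kind of effect a polynomial regression tests. A minimal Python sketch, assuming one row per animal; the file and column names are hypothetical.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("foot_scores.csv")  # assumed columns: age_class, foot_angle, claw_set
      # Encode age class as an ordered covariate: 1 = yearling ... 4 = 4 yr and older.
      df["age"] = df["age_class"].map({"yearling": 1, "2yr": 2, "3yr": 3, "4plus": 4})

      for trait in ("foot_angle", "claw_set"):
          fit = smf.ols(f"{trait} ~ age + I(age**2)", data=df).fit()
          # The age term tests the linear trend; I(age ** 2) tests the quadratic component.
          print(trait, fit.pvalues[["age", "I(age ** 2)"]].round(3).to_dict())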
  • Item
    Leveraging a global spring, 2-row barley population to accelerate the development of superior forage barley varieties for Montana growers
    (Montana State University - Bozeman, College of Agriculture, 2021) Hoogland, Traci Janelle; Chairperson, Graduate Committee: Jamie Sherman
    As more people around the globe escape poverty, they are eating more meat and dairy products. To support this increased demand for animal products, there is an urgent need to develop more sustainable, high-quality forage and hay crops for the livestock production industry. Barley (Hordeum vulgare ssp. vulgare L.) is considered one of the most drought-tolerant of the annual cereals, and spring barley has been shown to outyield established perennial forages under drought conditions in central Montana (Cash, Surber, & Wichman, 2006). To accelerate the development of superior forage barley varieties for Montana, the following goals were identified: 1) utilize a genome-wide association analysis to find genetic regions related to key forage and agronomic traits; 2) use statistical modeling to a) examine the relationship between difficult-to-measure forage traits, such as quality and yield, and easy-to-measure agronomic traits, such as flowering time and plant height, and b) identify agronomic traits that can be used as proxies for yield and quality in the earliest stages of the breeding program, when genetic and phenotypic variability are at their greatest. Through these techniques the importance of variation in the timing of plant maturity was identified. Statistical modeling showed that variability in forage yield and quality was closely related to variability in the timing of heading and soft-dough dates. Plant height was also important, especially for biomass yield. Through genome-wide association analysis, novel QTL were discovered for all studied traits. QTL were detected on all seven chromosomes, and the majority of forage trait QTL co-located with QTL related to the timing and progression of plant development and maturity. This appeared to indicate that, in a population of global barley accessions, the loci with the greatest impact on forage traits may be those containing genes regulating plant development and senescence. This further strengthened the evidence from the modeling study that a relationship exists between the two trait categories: traits measuring the timing of plant development and forage traits.
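    As a rough sketch of the genome-wide association idea the abstract describes, the code below runs a single-marker regression scan of a trait on SNP allele dosages. It is a toy Python illustration under simplified assumptions (simulated data, no kinship or population-structure correction), not the thesis's actual pipeline.

      import numpy as np
      from scipy import stats

      def marker_scan(genotypes, phenotype, maf_min=0.05):
          """genotypes: (n_plants, n_snps) 0/1/2 dosage matrix; returns per-SNP p-values."""
          pvals = np.ones(genotypes.shape[1])
          for j in range(genotypes.shape[1]):
              g = genotypes[:, j]
              maf = min(g.mean() / 2, 1 - g.mean() / 2)
              if maf < maf_min:
                  continue  # skip rare markers
              pvals[j] = stats.linregress(g, phenotype).pvalue
          return pvals

      rng = np.random.default_rng(0)
      geno = rng.integers(0, 3, size=(200, 1000)).astype(float)  # simulated accessions
      pheno = 0.8 * geno[:, 42] + rng.normal(size=200)           # one causal marker
      print("top marker:", marker_scan(geno, pheno).argmin())    # recovers marker 42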
  • Item
    Food resources for grizzly bears at army cutworm moth aggregation sites in the Greater Yellowstone Ecosystem
    (Montana State University - Bozeman, College of Agriculture, 2022) Lozano, Katerina N.; Chairperson, Graduate Committee: Robert K. D. Peterson; This is a manuscript style paper that includes co-authored chapters.
    Army cutworm moths (Euxoa auxiliaris) (ACM) migrate annually to peaks on the eastern edge of the Greater Yellowstone Ecosystem (GYE). Grizzly bears (Ursus arctos horribilis) feed on these moths from mid- to late summer. The Shoshone Forest is preparing a management plan to address the conservation of these sites and foraging bears. Increased human use and GYE-wide changes in grizzly bear food availability and related foraging patterns are concerns prompting plan preparation. This study addresses grizzly bear diet and vegetation foraging locations on a prominent moth site ('South Site'). A 1991 study identified 4 forb genera utilized by bears at ACM sites. A 2017-2018 study identified 5 more and postulated that biscuitroot (Lomatium spp.), found in high-elevation meadows, was an important resource for grizzly bears. During 2020-2021 we clarified these findings using scat collection and descriptions of available vegetation. We determined the frequency and volume of food items in 298 scats. We quantified vegetation at peak meadows (elevation: 3,078-3,657 m) and in cirque basins (elevation: 3,658-3,931 m) to record the percent cover of nine forb genera. We also described the density of biscuitroot and of craters where bears excavated roots to determine whether biscuitroot influences foraging site choices for grizzly bears. We confirmed use of 7 of the 9 previously identified forb genera. The foods most frequently consumed by grizzly bears were ACM (23% volume) and roots and tubers (38% volume). Similarly, the 2017-2018 study found 20% ACM by volume and 45% roots and tubers by volume. There was a positive, linear relationship between the density of flowering biscuitroot and the density of craters from grizzlies digging roots in several peak meadows (p < 0.001). Rather than foraging solely on ACMs, grizzly bears on this moth site relied heavily on vegetation in their diet, specifically roots and tubers from biscuitroot and clover. Our findings suggest grizzly bears have a diverse diet at this moth site that may allow them to adjust to variations in ACM abundance. They focused foraging on roots and tubers at 5 peak meadows near talus where moth foraging occurs, information that can potentially help mitigate human-grizzly bear interactions involving climbers.
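    The biscuitroot-crater relationship reported here is a simple linear regression. A minimal Python sketch with invented plot-level counts, not the study's data:

      from scipy import stats

      biscuitroot = [2, 5, 9, 14, 20, 31]  # hypothetical flowering stems per plot
      craters = [0, 1, 3, 4, 8, 11]        # hypothetical excavation craters per plot

      res = stats.linregress(biscuitroot, craters)
      print(f"slope = {res.slope:.2f}, p = {res.pvalue:.4f}")  # positive slope, small p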
  • Item
    Impact of increasing NaCl levels in livestock drinking water on the intake and utilization of low-quality forages by beef cattle
    (Montana State University - Bozeman, College of Agriculture, 2022) Nack, Makae Frances; Chairperson, Graduate Committee: Timothy DelCurto; This is a manuscript style paper that includes co-authored chapters.
    Water is one of the most important nutrients for livestock production (Petersen et al., 2015), but its quality is often overlooked in western range settings. The western United States, and Montana more specifically, are prone to variable precipitation and droughts, which reduce the quantity and quality of livestock drinking water as well as forage quality and quantity. Dormant-season grazing of rangeland forages often involves self-fed, salt-limited supplements to meet cattle nutrition requirements and better utilize forage. Self-fed supplements commonly add salt as an intake limiter because it is effective, cheap, and necessary in beef cattle diets (Cardon et al., 1951). The objective of this study was to evaluate the impacts of increasing NaCl levels in drinking water on intake, digestibility, and rumen fermentation of cattle consuming low-quality forage. Eight ruminally cannulated Angus crossbred cows were individually stalled and used in two 4 × 4 Latin squares. One square was hand-fed a non-salt supplement; the second square was fed a salt-limiting supplement. Two cows (one from each square) were assigned to one of four water treatments per period: 1) control, no added NaCl; 2) 1,000 mg NaCl/L; 3) 2,000 mg NaCl/L; and 4) 3,000 mg NaCl/L. A 14-day adaptation period allowed cattle to acclimate to the water treatments and was followed by a 6-day total collection period. Rumen fluid samples were collected on day 22 at hours 0, 4, 8, 12, 18, and 24, and on day 23 a total rumen evacuation was conducted to determine total rumen volume and collect rumen content samples. Increasing levels of NaCl did not influence intake in either study (P ≥ 0.36). Rumen pH was influenced by water NaCl in study 1 (P = 0.01); however, post hoc analysis revealed no pairwise differences. Volatile fatty acid concentrations were not affected by NaCl in either study (P ≥ 0.39). Our results suggest the NaCl levels in our study have little influence on intake, rumen fermentation, and liquid kinetics, indicating that NaCl levels up to 3,000 mg/L are safe for cattle.
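    The design described here, where each cow receives each water treatment across periods, is a Latin square, and the treatment effect is tested against cow and period variation. A minimal Python sketch under assumed file and column names:

      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.read_csv("water_study.csv")  # assumed: cow, period, nacl (0/1000/2000/3000), dm_intake
      fit = smf.ols("dm_intake ~ C(cow) + C(period) + C(nacl)", data=df).fit()
      print(sm.stats.anova_lm(fit, typ=2))  # F-test for the water-NaCl treatment effect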
  • Item
    Evaluation of sustained release mineral boluses as a long-term nutrient delivery method for beef cattle
    (Montana State University - Bozeman, College of Agriculture, 2021) Carlisle, Tanner Jay; Chairperson, Graduate Committee: Timothy DelCurto; Samuel A. Wyffels, Steve D. Stafford, Anna R. Taylor, Megan L. Van Emon and Timothy DelCurto were co-authors of the article, 'Evaluation of sustained release mineral boluses as a long-term nutrient delivery method for beef cattle' in the journal 'Animal feed science and technology' which is contained within this thesis.
    Two studies were conducted to evaluate the efficacy of sustained-release mineral boluses as an alternative nutrient delivery method for beef cattle. For both studies, 16 ruminally cannulated cows were used in a completely randomized design. In study 1, we evaluated degradation rates of two bolus prototypes and cow age (2-yr-old versus 3-yr-old cows) over an 87-d study period. In study 2, we evaluated two bolus types (90-d degradation target versus 180-d degradation target) as well as two diet qualities contrasting low-quality, high-fiber forage (> 65% NDF and < 8% CP) and high-quality, low-fiber forage (< 55% NDF and > 15% CP). For both studies, intake and digestion periods were conducted to evaluate cow age (study 1) or diet quality (study 2) effects on intake and rumen/reticulum function. In study 1, models containing an asymptotic effect of day and an interaction between day and bolus type received virtually all of the support among candidate models for bolus degradation rate. Cow age did not affect bolus degradation rates (β = -0.81 ± 1.13; P = 0.48), and degradation rates were greater for bolus prototype B than for prototype A (β prototype B = -20.39 ± 1.13; β prototype A = -9.64 ± 0.81; P < 0.01). In study 2, models containing a linear effect of day and an interaction between day and diet received virtually all of the support among candidate models for the degradation rates of the 90-d and 180-d prototypes. In addition, both bolus prototypes displayed a diet quality × time interaction (P < 0.01) for bolus degradation rate. Cattle treated with the 90-d bolus and fed a high-quality diet had greater degradation rates (β High-quality = -2.64 ± 0.08; β Low-quality = -1.97 ± 0.10; P < 0.01) than cows fed a low-quality diet. In contrast, for cattle treated with the 180-d bolus, degradation rates were greater for cows on the low-quality diet than the high-quality diet (β Low-quality = -0.09 ± 0.007; β High-quality = -0.04 ± 0.005; P < 0.01). Across both studies, two of four bolus prototypes met target release rates at 90 days. However, bolus degradation characteristics varied and were influenced by diet quality.
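    The degradation-rate modeling could be sketched as a nonlinear curve fit with an information-criterion comparison. A minimal Python illustration, assuming bolus residual weight is tracked over days; the decay form, data values, and AIC calculation are illustrative stand-ins for the thesis's candidate-model selection.

      import numpy as np
      from scipy.optimize import curve_fit

      def decay(day, w0, k):
          return w0 * np.exp(-k * day)  # simple exponential decline toward zero

      day = np.array([0, 15, 30, 45, 60, 75, 87], dtype=float)
      weight = np.array([100, 78, 60, 47, 36, 29, 24])  # hypothetical % of initial weight

      params, _ = curve_fit(decay, day, weight, p0=[100, 0.02])
      rss = ((weight - decay(day, *params)) ** 2).sum()
      aic = len(day) * np.log(rss / len(day)) + 2 * 2  # least-squares AIC, 2 parameters
      print(f"k = {params[1]:.4f} per day, AIC = {aic:.1f}")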
  • Item
    Salt limited intake: impacts of salt level and form of supplement on intake, nutrient digestion, and variability of supplement intake in beef cattle
    (Montana State University - Bozeman, College of Agriculture, 2021) White, Hayley Christina; Co-chairs, Graduate Committee: Megan Van Emon and Timothy DelCurto; M. L. Van Emon, H. M. DelCurto-Wyffels, S. A. Wyffels and T. DelCurto were co-authors of the article, 'Impacts of form of salt-limited supplement on supplement intake behavior and performance with yearling heifers grazing dryland pastures' submitted to the journal 'Journal of feed science & technology' which is contained within this thesis.; M. L. Van Emon, H. M. DelCurto-Wyffels, S. A. Wyffels and T. DelCurto were co-authors of the article, 'Impacts of increasing levels of salt on intake, digestion, and rumen fermentation with beef cattle consuming low-quality forages' submitted to the journal 'Journal of animal science' which is contained within this thesis.
    For centuries, salt has been used as a cost-effective intake limiter of supplements for ruminants. Beef cattle production in the western United States relies on self-fed, salt-limited supplement to offset seasonal nutrient deficiencies, which, in turn, may improve performance and increase forage intake. However, research has found high variation in individual supplement intake among animals and across days. If cattle over-consume high-salt diets, this may result in negative impacts on animal performance and additional cost for the producer. Two studies were conducted to evaluate the effects of form of supplement on supplement intake behavior, body weight, and body condition change, and the impacts of supplemental salt levels on forage intake, water intake, dry matter digestibility, and rumen fermentation of yearling heifers consuming low-quality forages. During a two-year summer grazing trial, individual supplement intake, time spent at the feeder, and frequency of visits were measured. Supplementation and form of supplement did not influence heifer weight gain or intake CV (P = 0.26), but heifers in the pelleted treatment consumed more supplement (g/kg BW) at a faster rate than heifers fed the loose supplement form (P < 0.01). In study 2, six ruminally cannulated heifers were assigned to treatments to determine the effect of salt level on digestibility and rumen parameters. Salt treatments consisted of: 1) control, no salt (CON); 2) 0.05% of BW salt (LOW); and 3) 0.1% of BW salt (HIGH). Forage and water intake, digestibility, and rumen parameters were measured. Supplemental salt tended to decrease forage intake (g/kg BW; P = 0.06) and tended to increase DM fill (P = 0.07). Both water intake and liquid fill increased with increasing levels of salt (P < 0.01). Ruminal pH and ammonia levels decreased with increasing salt (P < 0.01), while acetate concentration increased (P < 0.01). Digestibility was not influenced by salt level (P > 0.05). Our results suggest that pelleting salt-limited supplements masks the intake regulation provided by salt. Additionally, increasing levels of salt modify rumen fermentation and digestion, suggesting lower efficiency of intake and use with high-salt diets.
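    The intake CV statistic compared across supplement forms can be computed from electronic-feeder records. A minimal Python sketch; the file and column names are hypothetical.

      import pandas as pd

      visits = pd.read_csv("feeder_visits.csv")  # assumed: animal_id, date, intake_g
      daily = visits.groupby(["animal_id", "date"])["intake_g"].sum().reset_index()
      # Day-to-day CV (%) of supplement intake for each animal.
      cv = daily.groupby("animal_id")["intake_g"].agg(lambda x: 100 * x.std() / x.mean())
      print(cv.describe())  # compare this distribution between pelleted and loose groups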
  • Item
    The impacts of supplementing rumen degradable or undegradable protein to heifers and cows on supplement intake behavior, performance, reproduction, and nutrient digestion
    (Montana State University - Bozeman, College of Agriculture, 2021) Manoukian, Marley Kathryn; Chairperson, Graduate Committee: Megan Van Emon; J.A. Kluth, S.A. Wyffels, T. DelCurto, C. Sanford, T.W. Geary, A. Scheaffer and M.L. Van Emon were co-authors of the article, 'Impacts of rumen degradable or rumen undegradable protein supplement on supplement intake behavior, performance, and reproductive parameters with yearling heifers and cows grazing dryland pastures' submitted to the journal 'Journal of animal science' which is contained within this thesis.; J.A. Kluth, S.A. Wyffels, T. DelCurto, A. Scheaffer and M.L. Van Emon were co-authors of the article, 'Impacts of rumen degradable or rumen undegradable protein supplement with or without salt on nutrient digestion and VFA concentrations' submitted to the journal 'Journal of animal science' which is contained within this thesis.
    Low-quality forages, often low in protein, are a common feed resource for beef cattle in Montana and the western United States. A supplement intake study and a digestion study were performed to observe the effects of rumen degradable protein (RDP) and rumen undegradable protein (RUP) on supplement intake behavior, performance, reproductive parameters, nutrient digestion, and rumen kinetics. Yearling heifers were used in a completely randomized design, and two- and three-year-old cows were used in a randomized complete block design, for an 84-d supplement intake study. Treatments were: 1) a pressed supplement block containing RUP (RUP), and 2) a pressed supplement block containing RDP (RDP). Heifer and cow supplement intake displayed (P < 0.01) a treatment × period interaction. Cow intake rate and coefficient of variation displayed (P < 0.01) a treatment × period interaction. The RUP heifers consumed supplement faster (P < 0.01) than RDP heifers. The RDP cows had greater (P < 0.01) average daily gains than RUP cows. The RUP cows had greater final pregnancy rates than RDP cows (P = 0.04). In conclusion, protein type impacted intake behavior in cows and heifers, and RDP cows had greater ADG, but protein type did not negatively impact final performance or pregnancy success. Eight two-year-old and eight three-year-old rumen-fistulated cows were used in a 2 × 2 factorial design for a 22-d digestion study. Animals were fed an ad libitum low-quality diet. Supplements contained either RDP or RUP and were self-fed (SF) as salt-limited pressed blocks or hand-fed (HF) as the same loose ingredients without salt, resulting in 4 dietary treatments: 1) RUP-SF, 2) RUP-HF, 3) RDP-SF, and 4) RDP-HF. There was a delivery × protein type interaction (P ≤ 0.04) for both NDF digestibility and water intake. There was an effect (P = 0.02) of protein type on fluid flow rate. Ruminal ammonia displayed (P < 0.01) a delivery × protein type × hour interaction. Valerate ruminal concentrations were greater in RDP-supplemented animals than in RUP-supplemented animals (P = 0.04). In conclusion, self-fed supplements containing RDP may enhance the use of low-quality forages.
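    The delivery × protein type interaction reported here is the kind of effect a 2 × 2 factorial ANOVA tests. A minimal Python sketch; the file and column names are assumptions, not the study's data layout.

      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.read_csv("digestion_study.csv")  # assumed: protein (RDP/RUP), delivery (SF/HF), ndf_dig
      fit = smf.ols("ndf_dig ~ C(protein) * C(delivery)", data=df).fit()
      print(sm.stats.anova_lm(fit, typ=2))  # interaction row tests delivery x protein type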
  • Item
    Effects of mountain pine beetle on elk habitat and nutrition in the Elkhorn Mountains of Montana
    (Montana State University - Bozeman, College of Letters & Science, 2018) Cascaddan, Brent Morris; Chairperson, Graduate Committee: Robert A. Garrott
    Mountain pine beetle (Dendroctonus ponderosae, MPB) outbreaks have become increasingly prevalent in western North America, resulting in ecological changes in pine forests that have important implications for wildlife populations and habitat. The potential effects of MPB-caused tree mortality on ungulate populations and habitat are relatively unstudied, and the possibility exists for both beneficial changes to ungulate habitat, such as increased production of forage (i.e., forage availability) through the opening of the forest canopy, and negative impacts, such as accelerated phenology of herbaceous plants that may reduce forage quality. Using data collected during 2015-2017 in MPB-impacted National Forests in west-central Montana, I quantified the effects of MPB outbreaks on elk summer forage resources and use. To accomplish this objective, I 1) evaluated differences in herbaceous plant communities between mature uninfested lodgepole pine stands and two temporal classes of MPB-impacted forest stands (i.e., lodgepole pine cover classes: mature uninfested; old infested, ≥ 10 years old; recent infested, < 10 years old), 2) evaluated differences in elk summer forage availability and herbaceous vegetation quality, and 3) compared current elk use of lodgepole cover classes (2015-2017) to a previous elk telemetry study conducted during 1980-1991, before the MPB epidemic. I found that herbaceous forage plant communities did not differ in plant species composition among cover classes, but forage abundance did differ, being highest in the old-infested cover class and lowest in the mature uninfested cover class. The dominant phenology stage of forage species did not change across cover classes by a biologically meaningful amount; herbaceous quality differed across cover classes, but the differences were small. During the 2015-2017 study, elk used all three lodgepole pine cover classes in proportion to their availability. Elk use of lodgepole pine during the 1980-1991 study was approximately double its estimated availability, suggesting elk now use the beetle-killed forest less than they did prior to infestation. My results indicate MPB does not negatively affect elk nutrition during late summer (July and August), and active management of beetle-killed forest is not necessary for the benefit of elk during this time period, though it may be needed to improve elk habitat in other ways during other times of year.
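    The use-versus-availability comparison can be illustrated with a chi-square goodness-of-fit test of telemetry locations against cover-class availability. The counts and proportions below are invented for illustration, not the study's data.

      import numpy as np
      from scipy import stats

      use = np.array([310, 420, 270])              # hypothetical elk locations per cover class
      availability = np.array([0.32, 0.41, 0.27])  # hypothetical proportion of area per class

      chi2, p = stats.chisquare(use, f_exp=use.sum() * availability)
      print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # large p: use in proportion to availability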
  • Item
    Space use and foraging patterns of the white-headed woodpecker in western Idaho
    (Montana State University - Bozeman, College of Letters & Science, 2017) Kehoe, Adam Roarke; Chairperson, Graduate Committee: Jay J. Rotella
    The white-headed woodpecker (Picoides albolarvatus) is a species of conservation concern that is strongly associated with ponderosa pine (Pinus ponderosa)-dominated forests in the Inland Northwest. More information on home range size and habitat selection patterns is needed to inform conservation of the white-headed woodpecker, a focal management species for dry-forest restoration treatments. We examined whether home range size was associated with food resources and whether fine-scale habitat characteristics influenced selection of foraging sites. During the post-fledging periods of 2014 and 2015, we radio-tracked 11 white-headed woodpeckers in forests of west-central Idaho. These forests were historically managed for timber harvest, resulting in removal of large-diameter, cone-producing ponderosa pine trees. We hypothesized that ponderosa pine cones would be a highly valued food resource providing seeds and arthropods. We expected smaller home ranges to be associated with a greater availability of cones for foraging and cone foraging to be concentrated in core use areas. We used foraging behavior to test this hypothesis, specifically the proportion of time spent foraging on cones as an index of cone availability. Home range sizes ranged from 24 to 169 ha (90% fixed-kernel estimates). Consistent with our hypothesis, individuals with relatively small home ranges spent a greater proportion of foraging time on cones (β1 [SE] = 2.48 [1.32], P = 0.096; β2 [SE] = -5.00 [1.61], P = 0.014). Cone foraging was also higher in core use areas than in home range peripheries for individuals exhibiting at least moderate cone foraging. We also expected foraging woodpeckers to favor larger-diameter pines in sites with moderate to high canopy closure. To test this hypothesis, we analyzed foraging-site selection by comparing habitat characteristics between foraging trees and available trees, which supported our foraging-site prediction (β TREE DIAMETER [SE] = 3.50 [0.43], P < 0.001; β CANOPY [SE] = 1.74 [0.41], P < 0.001; β SPECIES [SE] = 1.43 [0.33], P < 0.001). Our results suggest that large-diameter pines provide important foraging resources and that landscapes with more productive cone crops could support greater numbers of white-headed woodpeckers. We recommend restoration treatments that retain high-density patches of large-diameter pines while promoting mosaics of open and closed canopies at larger spatial scales.
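    The foraging-site selection analysis contrasts used trees against available trees, which maps naturally onto a logistic regression. A minimal Python sketch; the file and column names are assumptions, not the study's variables.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Assumed columns: used (1 = foraging tree, 0 = available tree),
      # dbh_cm, canopy_pct, is_ponderosa (0/1).
      trees = pd.read_csv("trees.csv")
      fit = smf.logit("used ~ dbh_cm + canopy_pct + is_ponderosa", data=trees).fit()
      print(fit.params)  # positive coefficients indicate selection for that characteristic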