Scholarly Work - Animal & Range Sciences

Permanent URI for this collection: https://scholarworks.montana.edu/handle/1/8931

Search Results

Now showing 1 - 10 of 10
  • Habitat Relations of Mule Deer in the Texas Panhandle.
    (1985-11-27) Koerth, Benjamin H.; Sowell, Bok F.; Bryant, Fred C.; Wiggers, Ernie P.
    Telemetric observations of mule deer (Odocoileus hemionus) does were used to determine seasonal relationships between deer use and availability of habitats on two study areas in the Panhandle region of west Texas. Juniper breaks was the only habitat type used in greater proportion than its availability on the Canadian River study area (CRSA). On the Clarendon study area (CSA), mule deer shifted seasonal preferences among riparian, cultivated fields, and juniper breaks. Annual and seasonal home ranges were considerably larger on the CSA. Larger home ranges and variability in seasonal use of the habitat types on the CSA were attributed largely to the presence of cultivated winter grain fields. Shifts in home ranges coincided with the season when production in the cultivated fields was highest and native forage availability was lowest. Comparisons of indirect deer observations (pellet groups and bed sites) with random measurements within each habitat type characterized deer use sites as east- and north-facing slopes, located close to a canyon rim, and receiving light livestock and human use. The placement of cultivated grain fields could be used to influence overall range use and attract deer to or away from localized sites. Also, consideration should be given to directing heavy livestock and human traffic away from sites used by mule deer.
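
    A use-versus-availability comparison like the one described above is often tested with a chi-square goodness-of-fit test of telemetry locations against expected counts derived from habitat availability (in the style of Neu et al. 1974). The sketch below is only illustrative: the habitat counts and availability proportions are hypothetical, not data from this study.

    import numpy as np
    from scipy.stats import chisquare

    habitats = ["juniper breaks", "riparian", "cultivated fields", "grassland"]
    observed_locations = np.array([62, 21, 10, 7])     # telemetry fixes per habitat (hypothetical)
    availability = np.array([0.30, 0.15, 0.10, 0.45])  # proportion of study area (hypothetical)

    expected = availability * observed_locations.sum()
    chi2, p = chisquare(observed_locations, f_exp=expected)
    print(f"chi-square = {chi2:.1f}, p = {p:.4f}")

    # Habitats with observed use above expected use are candidates for
    # "used in greater proportion than availability"; in practice this is
    # confirmed with Bonferroni confidence intervals on the use proportions.
    for name, obs, exp in zip(habitats, observed_locations, expected):
        print(f"{name}: observed {obs}, expected {exp:.1f}")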
  • Protein Supplementation and 48-Hour Calf Removal Effects on Range Cows
    (1992-03) Sowell, Bok F.; Wallace, Joe D.; Parker, Eugene E.; Southward, Morris G.
    In 1984, 99 Angus × Hereford cows (4- to 6-yr-olds) were assigned randomly to a 4-yr, 2 × 2 factorial study. Treatment assignment was permanent, and no new cows were added during the study. By 1987, 71 cows remained, and overall, 335 complete cow-calf data sets were used. Main effect treatments were beginning time (prepartum [PRE] vs postpartum [POST]) for crude protein (CP) supplementation (twice-weekly feeding of 41% CP cottonseed meal pellets at 1.58 kg per cow per feeding) and temporary calf removal (48 hour [48-H] vs 0 hour [CONT]) just before the breeding season. For analyses, sex of calf was included as a third main effect (2 × 2 × 2) and year was included as a random factor; the 4-way interaction served as the testing term for repeated measures over years. Year was the dominant source of variation for most traits; we attributed this mainly to different amounts and timing of precipitation among years. Very few interactions were observed. The PRE supplemented cows had reduced (P<0.01) spring body weight losses and higher prebreeding body condition scores (4.9 vs 4.5; P<0.01) compared with POST cows. Reproductive performance did not differ between PRE and POST cows. Use of 48-H calf removal vs CONT did not influence (P>0.10) reproductive traits measured. Likewise, 48-H treatment did not impair health or reduce weaning weights of calves. In a separate, within-year analysis used to examine age of dam effects, productivity of 4-yr-old cows during 1984 was slightly below that of older cows for some traits. Cow age effects were not detected in other years. We conclude that control cows in our study were approaching optimum fertility and production levels in concert with their environment and that improvement beyond these levels with the treatments imposed was unlikely.
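
    The 2 × 2 × 2 factorial with year as a random factor described above can be approximated with a linear mixed model; a minimal sketch follows. The column names (bcs, supp_time, calf_removal, calf_sex, year) and the input file are hypothetical, and the original analysis was a repeated-measures ANOVA with the 4-way interaction as the testing term, so this illustrates the design rather than reproducing the authors' analysis.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical file of complete cow-calf data sets; column names are assumptions.
    df = pd.read_csv("cow_calf_records.csv")

    # Fixed effects: supplementation time x calf removal x calf sex (2 x 2 x 2);
    # year enters as a random grouping factor.
    model = smf.mixedlm(
        "bcs ~ C(supp_time) * C(calf_removal) * C(calf_sex)",
        data=df,
        groups=df["year"],
    )
    result = model.fit()
    print(result.summary())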
  • Application of Feeding Behaviour to Predict Morbidity of Newly Received Calves in a Commercial Feedlot.
    (2001-09) Quimby, W. F.; Sowell, Bok F.; Bowman, J. G. P; Branine, M. E.; Hubbert, M. E.; Sherwood, H. W.
    The objective of this study was to use feeding behavior of newly received steers (average initial weight 191 kg) to detect morbidity in animals in a commercial feedlot. Two separate 32 d feeding trials were conducted in Wellton, Arizona, in July and November 1996. Radio frequency technology was used to record the total time spent within 50 cm of the feedbunk (the number of 5.25-s intervals in which an animal's presence was detected, multiplied by 5.25 s) in 3 h intervals from 0600 to 2400 on a daily basis for 103 and 122 male calves in trials 1 and 2, respectively. Statistical procedures based on the cumulative sums (CUSUM) of the 3 h feeding intervals were used to detect morbid animals, compared with detection of animals deemed morbid by experienced pen riders. In trial 1, the CUSUM procedure detected animal morbidity 4.5 d earlier (P < 0.001) than the feedlot personnel. In trial 2, the CUSUM procedure detected animal morbidity 3.7 d earlier (P < 0.001) than feedlot pen riders. Overall accuracy, positive predictive value and sensitivity of the CUSUM prediction method were 87, 91, and 90%, respectively. Combined trial data suggest that feeding behavior during the first 30 d cattle are in a receiving pen, as collected with radio frequency technology and analyzed with CUSUM charts, may be used to detect animal morbidity approximately 4.1 d earlier (P < 0.001) than conventional methods typically employed in commercial feedlots.
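
    The CUSUM screening described above can be sketched as a one-sided cumulative-sum chart that accumulates shortfalls in an animal's feeding time and signals once the accumulated shortfall exceeds a decision limit. The reference mean, allowance, and decision limit below are arbitrary placeholders, not the parameters used in these trials.

    import numpy as np

    def cusum_flag(x, mu0, k, h):
        """Return the index of the first CUSUM signal, or None if none occurs."""
        s = 0.0
        for t, xt in enumerate(x):
            # accumulate only shortfalls larger than the allowance k
            s = max(0.0, s + (mu0 - k) - xt)
            if s > h:
                return t
        return None

    # Hypothetical daily feeding times (minutes) for one calf.
    feeding_minutes = np.array([95, 102, 98, 90, 60, 45, 40, 35])
    day = cusum_flag(feeding_minutes, mu0=95, k=10, h=60)
    print("CUSUM signal on day index:", day)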
  • Horses and Cattle Grazing in the Wyoming Red Desert, II. Dietary Quality.
    (1984-05) Krysl, L. J.; Sowell, Bok F.; Hubbert, M. E.; Plumb, G. E.; Jewett, T. K.; Smith, M. A.; Waggoner, J. W.
    Botanical composition of horse and cattle diets from fecal analysis and nutrient quality of hand-harvested forages used by these herbivores were evaluated to assess dietary quality during the summer and winter seasons of 1981 in the Wyoming Red Desert. Dietary crude protein estimates averaged 7.5±0.1% and 9.0±0.5% during the summer for horses and cattle, respectively. Dietary crude protein estimates in the winter were lower, averaging 6.1±0% and 6.0±0% for horses and cattle, respectively. Estimated dietary calcium levels for both herbivores were high through the summer and winter, while dietary phosphorus levels appeared to be deficient during both seasons. Average in vitro dry matter disappearance coefficients for horses and cattle during the summer were 52±2% and 52±2%, respectively. During the winter these values dropped to 39±1% and 40±1% for horses and cattle, respectively.
  • Estimating Digestibility and Faecal Output in Lambs Using Internal and External Markers.
    (1988-08-01) Krysl, L. J.; Galyean, M. L.; Estell, R. E.; Sowell, Bok F.
    Twenty fine-wool, ruminally cannulated lambs (average weight 45.9 kg) were used in a completely random design to evaluate the ability of three internal markers to predict dry matter digestibility and two external markers to estimate faecal output. Lambs were allotted randomly to one of four diets: 100% prairie hay (PH), 100% lucerne hay (LH), 50% prairie hay:50% sorghum grain (PS) and 50% lucerne hay:50% sorghum grain (LS). The trial consisted of a 14-day adaptation period followed by a 7-day total faecal collection period. Feed and faecal samples were subjected to 96 h ruminal fluid and 48 h acid-pepsin digestions, followed by extraction with acid detergent (IVADF) or neutral detergent (IVNDF) solution. Dry matter digestibility (DMD) calculated from feed:faeces ratios of IVADF, IVNDF and acid detergent lignin (ADL) was compared with in vivo apparent digestibility. Ytterbium-labelled forage (YLF) and dysprosium-labelled faeces (DLF) were pulse-dosed via ruminal cannulae, and faecal Yb and Dy excretion curves were fitted to a one-compartment, age-dependent model for estimation of faecal output, particulate passage rate (PPR) and mean gastrointestinal retention time. In vivo DMD in lambs fed PH was greater (P < 0.05) than DMD calculated from IVNDF, IVADF and ADL. In lambs fed LH and LS, in vivo DMD did not differ (P > 0.05) from marker estimates. In vivo DMD for lambs fed PS did not differ from IVNDF or IVADF estimates but was greater (P < 0.05) than the ADL estimate. No differences (P > 0.05) were observed in recovery among the three internal markers for any of the diets. Faecal output for lambs fed PH did not differ (P > 0.05) from marker estimates, but was overestimated by 15 to 20% by YLF and DLF. Faecal output for lambs fed LH was similar to the estimate from YLF, but less (P < 0.05) than the estimate with DLF. For lambs fed PS, faecal output did not differ from marker estimates, but YLF and DLF values were 16% lower and 17% higher, respectively. No significant differences were observed between actual and estimated faecal output for lambs fed the LS diet. Estimates of PPR with DLF were numerically greater than YLF estimates for all diets except LS. Correspondingly, mean gastrointestinal retention time was less (P < 0.05) for DLF than for YLF for all diets except LS.
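
    Both marker approaches in this study rest on simple marker mass balance: an internal marker estimates digestibility from its concentration in feed versus faeces, and a pulse-dosed external marker estimates faecal output as dose divided by the area under the faecal excretion curve. The sketch below uses made-up numbers and a plain trapezoidal area rather than the fitted one-compartment, age-dependent model reported above.

    import numpy as np

    # 1) Digestibility from an internal marker (IVADF, IVNDF or ADL):
    #    DMD = 1 - (marker concentration in feed) / (marker concentration in faeces)
    marker_in_feed = 8.0     # % of feed DM (hypothetical)
    marker_in_faeces = 17.0  # % of faecal DM (hypothetical)
    dmd = 1.0 - marker_in_feed / marker_in_faeces
    print(f"Estimated DM digestibility: {dmd:.2%}")

    # 2) Faecal output from a pulse-dosed external marker (Yb or Dy):
    #    faecal DM output = dose / area under the faecal marker-concentration curve.
    dose_mg = 1500.0                                 # marker pulse dose (hypothetical)
    t_days = np.array([0.5, 1, 1.5, 2, 3, 4, 5, 7])  # sampling times after dosing (d)
    conc = np.array([100, 900, 1400, 1200, 700, 350, 150, 30], dtype=float)  # mg/kg faecal DM
    auc = np.trapz(conc, t_days)                     # mg*d per kg faecal DM
    print(f"Estimated faecal output: {dose_mg / auc:.2f} kg DM/d")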
  • Horses and Cattle Grazing in the Wyoming Red Desert
    (1984-01) Krysl, L. J.; Hubbert, M. E.; Sowell, Bok F.; Plumb, G. E.; Jewett, T. K.; Smith, M. A.; Waggoner, J. W.
    The sagebrush-grass range in southcentral Wyoming presently supports large numbers of feral horses and domestic livestock. Diets of feral horses and cattle during summer and winter grazing were evaluated using fecal analysis under 2 stocking levels in small pastures. Horses and cattle consumed primarily grasses during the summer and winter. However, shrubs and forbs were also important dietary components. Needleandthread, Sandberg bluegrass, thickspike wheatgrass, Indian ricegrass, gray horsebrush, and winterfat were the major foods of horses and cattle during the summer and winter. Dietary overlap between horses and cattle during the summer averaged 72% and increased to 84% during the winter. Horses and cattle selected foods in a similar order.
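
    Dietary overlap figures such as the 72% and 84% reported above are typically computed with a similarity index that sums, over all foods, the smaller of the two species' dietary percentages. The abstract does not name the index used, so the sketch below (with hypothetical diet compositions) should be read as one common way to obtain such a value.

    # Hypothetical diet compositions (% of diet from fecal analysis).
    horse_diet = {"needleandthread": 30, "Sandberg bluegrass": 20,
                  "thickspike wheatgrass": 15, "Indian ricegrass": 10,
                  "gray horsebrush": 15, "winterfat": 10}
    cattle_diet = {"needleandthread": 25, "Sandberg bluegrass": 15,
                   "thickspike wheatgrass": 20, "Indian ricegrass": 15,
                   "gray horsebrush": 10, "winterfat": 15}

    # Overlap = sum over foods of the smaller of the two dietary percentages.
    overlap = sum(min(horse_diet.get(k, 0), cattle_diet.get(k, 0))
                  for k in set(horse_diet) | set(cattle_diet))
    print(f"Dietary overlap: {overlap}%")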
  • Influence of Ruminally Dispensed Monensin and Forage Maturity on Intake and Digestion.
    (1993-05) Fredrickson, Eddie L; Galyean, M. L.; Branine, M. E.; Sowell, Bok F.; Wallace, Joe D.
    Eight ruminally cannulated crossbred steers (average weight 336 kg) grazing native blue grama (Bouteloua gracilis [H.B.K.]) rangeland were used in a repeated measures design to evaluate effects of monensin ruminal delivery devices (MRDD) and forage phenology on ruminal digestion. Three periods were assessed: mid-August (Aug.), early October (Oct.), and mid-November (Nov.). One MRDD was placed in the reticulum of 4 steers via the ruminal cannula 21 days before each period. Intake was estimated using total fecal collections. Diet samples were collected using 3 esophageally fistulated steers. Ruminal fill was measured by ruminal evacuation; rate and extent of in situ ruminal neutral detergent fiber disappearance were estimated before ruminal evacuations. Ruminal passage rates, retention time, and apparent total tract organic matter digestibility were estimated using indigestible neutral detergent fiber. In vitro organic matter disappearance of esophageal masticate did not differ (P>.05) between Aug. and Oct. (average of 53.7%), but declined (P<.05) in Nov. (48.7%), whereas organic matter digestibility was greater (P<.10) in Aug. (62.3%) than in either Oct. (55.2%) or Nov. (53.9%). Release of monensin from the bolus (68 mg/day) was less than expected (100 mg/day). Intake, organic matter digestibility, ruminal passage rates, retention time, pH, and ammonia were not affected (P>.10) by MRDD. In situ neutral detergent fiber disappearance at 96 hours was decreased (P<.10) by MRDD (68 vs 65% for control and MRDD, respectively). As the grazing season progressed, intake declined (P<.10), whereas ruminal fill and retention time increased (P<.05), and passage rate of indigestible neutral detergent fiber decreased (P<.05). In situ neutral detergent fiber disappearance at 48 hours was greatest (P<.05) in Aug. and least (P<.05) in Nov.
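
    The passage rate and retention time estimated from indigestible neutral detergent fiber follow from steady-state marker pool dilution: the fractional passage rate is daily marker outflow (equal to marker intake for an indigestible marker) divided by the ruminal marker pool, and retention time is its reciprocal. The values in the sketch below are hypothetical placeholders, not data from the study.

    # Steady-state passage-rate sketch from an indigestible internal marker.
    indf_intake_kg_per_day = 1.2  # daily intake of indigestible NDF (hypothetical)
    indf_ruminal_pool_kg = 3.0    # indigestible NDF pool at ruminal evacuation (hypothetical)

    passage_rate = indf_intake_kg_per_day / indf_ruminal_pool_kg  # fraction of pool per day
    retention_time_h = 24.0 / passage_rate                        # mean ruminal retention, hours
    print(f"Passage rate: {passage_rate:.2f}/d; retention time: {retention_time_h:.0f} h")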
  • Liquid supplementation for ruminants fed low-quality forage diets: a review
    (1995-09) Bowman, J. G. P.; Sowell, Bok F.; Paterson, J. A.
    Forty-three studies involving liquid supplementation of cattle and sheep consuming low-quality forages were identified, summarized in tabular form and reviewed. All studies that could be found containing animal gain, forage intake and (or) supplement consumption with molasses-urea supplements under grazing conditions were reviewed. Seven studies were found which compared forage intake or animal performance by animals fed hay or straw and molasses-urea supplements with unsupplemented animals. Molasses-urea supplements did not increase forage intake or animal performance compared with unsupplemented animals in five of the seven studies. Thirteen studies were identified which evaluated performance of grazing animals receiving molasses-urea supplements compared with unsupplemented animals. Seven of these 13 grazing studies reported improved live weight change when animals received molasses-urea supplements. Only two grazing studies were found which evaluated forage intake by animals consuming molasses-urea supplements and compared it with that of unsupplemented animals. Both studies found no effect. In five of the six studies identified that compared molasses-urea supplements with dry supplements, forage intake and animal live weight change were not increased by molasses-urea supplements over dry supplements. Most authors concluded that feeding molasses-urea supplements to grazing ruminants was not as profitable as feeding dry supplements; however, few studies reported economic data. Studies demonstrated that the levels of molasses and nitrogen influenced animal performance. Asynchrony between molasses and nitrogen resulted in animal weight loss. Most positive animal responses were seen with a combination of high levels of molasses and nitrogen. However, these results may have been influenced by supplement formulation. Performance and intake results were confounded by pasture condition, forage quality, animal variation and supplement delivery system. In four studies that measured supplement intake by individual animals, between 1 and 20% of experimental animals refused to consume any molasses-urea supplement. Quantification of supplement intake and animal feeding behavior has not been adequately addressed in the literature.
  • Maladaptive nest-site selection by a sagebrush dependent species in a grazing-modified landscape
    (2019-04) Cutting, Kyle A.; Rotella, Jay J.; Schroff, Sean R.; Frisina, Michael R.; Waxe, James A.; Nunlist, Erika; Sowell, Bok F.
    Animals are expected to select habitats that maximize their fitness over evolutionary time scales. Yet in human-modified landscapes, habitat selection might not always lead to increased fitness because animals undervalue high-quality resources that appear less attractive than those of lower quality. In the American West, agriculture has modified landscapes, yet little is known about whether agricultural changes alter the reliability of the cues animals use to identify habitat quality, ultimately creating maladaptive breeding strategies in which behavioral cues are mismatched with survival outcomes. Using the greater sage-grouse, a species highly dependent upon sagebrush landscapes, we (1) evaluated how females select nesting habitats based on sagebrush type, livestock-grazing-related linear and point features, and other biotic and abiotic characteristics, given hypothesized influences on hiding cover, microclimate, and predator travel routes and perches, (2) compared habitat selection results with nest survival estimates to evaluate whether selection appears to be adaptive, and (3) used our results to evaluate the most appropriate strategies for this species in a grazing-modified landscape. Nest-site selection for sagebrush type appears to be maladaptive: in the most-preferred sagebrush type, nest survival rate was one-fourth the rate realized by females nesting in the sagebrush type that was avoided. Nest survival was four times higher for nests placed away from (>100 m), rather than next to (1 m), the nearest fence, and survival was lower at sites with higher cow pie density (a proxy for previous grazing intensity). Live and dead grasses influenced selection and survival in opposing ways: dead grass was selected for but resulted in reduced survival, while live grass was avoided but resulted in increased survival. Results collectively provide the first empirical evidence that a specific type of sagebrush acts as an ecological trap while another sagebrush type is undervalued. These results also suggest that adding more fences to control livestock grazing systems will likely reduce sage-grouse nest survival.
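
    The used-versus-available comparison described above is commonly formalized as a resource selection function fit by logistic regression, with nest survival modeled separately and the two sets of effects compared to spot mismatches. The sketch below is only illustrative: the covariate names and input file are hypothetical, and the authors' actual selection and survival models almost certainly differ in structure.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical file: used = 1 for nest sites, 0 for available points;
    # covariate names are assumptions based on the abstract.
    df = pd.read_csv("nest_points.csv")

    selection = smf.logit(
        "used ~ C(sagebrush_type) + dist_to_fence + cowpie_density + live_grass + dead_grass",
        data=df,
    ).fit()
    print(selection.summary())

    # Selection estimates would then be set against a separate nest-survival model
    # fit with the same covariates; covariates that are selected for but associated
    # with lower survival point to a mismatch between cues and outcomes.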
  • Beaver Habitat Selection for 24 Yr Since Reintroduction North of Yellowstone National Park
    (2018-01) Scrafford, Matthew A.; Tyers, Daniel B.; Patten, Duncan T.; Sowell, Bok F.
    Beavers (Castor canadensis) disappeared from drainages north of Yellowstone National Park in the mid-1900s because of trapping, a potential tularemia outbreak, and willow (Salix spp.) stand degradation by ungulates. Beavers were reintroduced in 1986 after a 40-yr absence, with inventories of active-beaver structures completed each fall for 24 consecutive yr after reintroduction. We used this inventory to evaluate the expansion of beaver populations in a riparian environment recovering from past overuse by ungulates. Specifically, we investigated the density of active-beaver colonies and dams, the change in willow cover, and the habitats associated with beaver expansion since reintroduction. Successful establishment and expansion of beavers indicate that sufficient resources were available to the population despite the suboptimal condition of riparian vegetation. Carrying capacity on third-order streams was reached approximately 14 yr after reintroduction (2000), with an average annual density of 1.33 active colonies/stream km (95th percentile = 1.23 - 1.44) between 2000 and 2010. The average annual density of beaver dams during this time was 2.37 active dams/stream km (2.04 - 2.71). Despite the beaver population being at carrying capacity in meadows since 2000, willow cover increased by 16% between 1981 and 2011. We speculate that beaver activities, together with reduced ungulate browsing resulting from predation and habitat loss, combined to increase willow cover. Willow cover and height were positively associated with colony longevity, and other influential variables included secondary channels, sinuosity, stream depth, and sandbar width. Our results provide evidence that beaver reintroduction can be successful in riparian areas where willow stand condition is less than optimal and that beavers might ultimately improve willow condition. We suggest that reducing ungulate use of overgrazed riparian environments will facilitate the reestablishment of beaver populations. We also identify the habitats that managers should look for in environments targeted for reintroduction.