Scholarship & Research
Permanent URI for this community: https://scholarworks.montana.edu/handle/1/1
Search Results
2 results
Item: Using the Balanced Scorecard as a Framework for Strategic Planning and Organizational Change (2020)
Johnson, Kris; Arlitsch, Kenning; Kyrillidou, Martha; Swedman, David

Strategic planning processes offer an opportunity to connect foundational practices with a vision for future change. In this chapter, Kotter's eight stages of change are mapped to the Montana State University Library's strategic planning effort (September 2017 – February 2018). Montana State University (MSU) is a land-grant public research university located in Bozeman, Montana. It is listed in the Carnegie Classification as a doctoral-granting university with "Higher Research Activity," and with a head count of nearly 17,000 students in Fall 2018, it is by far the largest institution of higher education in Montana. The university's annual budget is $201 million, and research and development expenditures exceeded $126 million in 2018. In addition to its teaching and research missions, MSU is one of 359 universities in the US awarded Carnegie's community engagement classification.

Item: Measuring Up: Assessing Accuracy of Reported Use and Impact of Digital Repositories (2014-02)
Arlitsch, Kenning; OBrien, Patrick; Kyrillidou, Martha; Clark, Jason A.; Young, Scott W. H.; Mixter, Jeff; Chao, Zoe; Freels-Stendel, Brian; Stewart, Cameron

We propose a research and outreach partnership that will address two issues related to more accurate assessment of digital collections and institutional repositories (IR):
1. Improve the accuracy and privacy of web analytics reporting on digital library use.
2. Recommend an assessment framework and web metrics that will help evaluate digital library performance, eventually enabling impact studies of IR on author citation rates and university rankings.
Libraries routinely collect and report website and digital collection use statistics as part of their assessment and evaluation efforts. The numbers they collect are reported to the libraries' own institutions, professional organizations, and/or funding agencies. Initial research by the proposed research team suggests the statistics in these reports can be grossly inaccurate, leading to a variance in numbers across the profession that makes it difficult to draw conclusions, build business cases, or engender trust. The inaccuracy runs in both directions, with under-reporting as much a problem as over-reporting. The team is also concerned with the privacy issues inherent in the use of web analytics software and will recommend best practices to ensure that user privacy is protected as much as possible while libraries gather data about use of digital repositories.
Institutional repositories have been in development for well over a decade, and many have accumulated significant mass. The business case for IR is built in part on the number of downloads of publications sustained by any individual IR. Yet preliminary evidence demonstrates that PDF and other non-HTML file downloads in IR often go uncounted, because search engines such as Google Scholar link users directly to the files and bypass the web analytics code that is supposed to record the download transaction (illustrated in the sketch following this abstract). It has been theorized that Open Access IR can help increase author citation rates, which in turn may affect university rankings. However, no comprehensive studies currently exist to prove or disprove this theory.
This may be because such a study could take years to produce results, given the publication and citation lifecycle, and because few libraries have an assessment model in place that helps them gather data over the long term. We plan to recommend an assessment framework that will help libraries collect data and understand the root causes of unexplained errors in their web metrics. The recommendations will provide a foundation for reporting metrics relevant to outcomes-based analysis and performance evaluation of digital collections and IR.
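The undercounting mechanism described in the abstract above can be made concrete with a small sketch. Page-tag analytics fire only when an HTML page executes the tracking JavaScript, so a reader who follows a Google Scholar result straight to a PDF never triggers the tag, while the repository's web server access log still records the request. The Python below is illustrative only and is not part of the proposal: the Common Log Format assumption, the bot signature list, and the last-octet IP truncation used for privacy are hypothetical choices, not recommendations from the research team.

```python
"""Minimal sketch: counting repository PDF downloads from server access logs
rather than from page-tag analytics. All format and policy choices here are
illustrative assumptions, not the proposal's methodology."""
import re
from collections import Counter

# Assumes Common Log Format with referrer and user-agent fields, e.g.:
# 203.0.113.42 - - [10/Feb/2014:13:55:36 -0700] "GET /bitstream/1/123/file.pdf HTTP/1.1" 200 48213 "-" "Mozilla/5.0"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

# Illustrative, deliberately incomplete list of crawler signatures to exclude.
BOT_SIGNATURES = ("bot", "crawler", "spider", "slurp")


def anonymize(ip: str) -> str:
    """Drop the last IPv4 octet so individual users are not identifiable."""
    parts = ip.split(".")
    return ".".join(parts[:3] + ["0"]) if len(parts) == 4 else "anon"


def count_pdf_downloads(log_lines):
    """Return (per-file PDF download counts, approximate unique visitors)."""
    counts = Counter()
    visitors = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        agent = m.group("agent").lower()
        if any(sig in agent for sig in BOT_SIGNATURES):
            continue  # skip crawler traffic
        if (m.group("method") == "GET" and m.group("status") == "200"
                and m.group("path").lower().endswith(".pdf")):
            counts[m.group("path")] += 1
            visitors.add(anonymize(m.group("ip")))  # store only truncated IPs
    return counts, len(visitors)


if __name__ == "__main__":
    sample = [
        '203.0.113.42 - - [10/Feb/2014:13:55:36 -0700] '
        '"GET /bitstream/1/123/file.pdf HTTP/1.1" 200 48213 "-" "Mozilla/5.0"',
        '198.51.100.7 - - [10/Feb/2014:13:56:01 -0700] '
        '"GET /bitstream/1/123/file.pdf HTTP/1.1" 200 48213 "-" "Googlebot/2.1"',
    ]
    downloads, uniques = count_pdf_downloads(sample)
    print(downloads, uniques)  # the Googlebot request is excluded from the count
```

Run as a script, the example counts one download for the sample PDF and excludes the crawler request; truncating IP addresses before storing them is one common way to balance unique-visitor estimates against the user-privacy concerns the proposal raises.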