Scholarly Work - Library
Permanent URI for this collection: https://scholarworks.montana.edu/handle/1/320
Item: Measuring Up: Assessing Accuracy of Reported Use and Impact of Digital Repositories (2014-02)
Authors: Arlitsch, Kenning; OBrien, Patrick; Kyrillidou, Martha; Clark, Jason A.; Young, Scott W. H.; Mixter, Jeff; Chao, Zoe; Freels-Stendel, Brian; Stewart, Cameron

We propose a research and outreach partnership that will address two issues related to more accurate assessment of digital collections and institutional repositories (IR):

1. Improve the accuracy and privacy of web analytics reporting on digital library use.
2. Recommend an assessment framework and web metrics that will help evaluate digital library performance, eventually enabling studies of the impact of IR on author citation rates and university rankings.

Libraries routinely collect and report website and digital collection use statistics as part of their assessment and evaluation efforts. The numbers they collect are reported to the libraries' own institutions, professional organizations, and/or funding agencies. Initial research by the proposed research team suggests that the statistics in these reports can be grossly inaccurate, leading to a variance in numbers across the profession that makes it difficult to draw conclusions, build business cases, or engender trust. The inaccuracy runs in both directions: underreporting is as much a problem as overreporting. The team is also concerned with the privacy issues inherent in the use of web analytics software and will recommend best practices to ensure that user privacy is protected as much as possible while libraries gather data about the use of digital repositories.

Institutional repositories have been in development for well over a decade, and many have accumulated significant mass. The business case for IR is built in part on the number of downloads of publications sustained by any individual IR.
Yet preliminary evidence demonstrates that PDF and other non-HTML file downloads in IR are often not counted, because search engines like Google Scholar bypass the web analytics code that is supposed to record the download transaction. It has been theorized that open-access IR can help increase author citation rates, which in turn may affect university rankings; however, no comprehensive studies currently exist to prove or disprove this theory. This may be because such a study could take years to produce results, given the publication citation lifecycle, and because few libraries have an assessment model in place to help them gather data over the long term. We plan to recommend an assessment framework that will help libraries collect data and understand the root causes of unexplained errors in their web metrics. The recommendations will provide a foundation for reporting metrics relevant to outcomes-based analysis and performance evaluation of digital collections and IR.
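To illustrate the undercounting problem described above: when a visitor arrives at a PDF directly (for example, via a Google Scholar result), no HTML page loads, so JavaScript page-tag analytics never fire, but the request still appears in the web server's access log. The sketch below shows one way such downloads could be counted from log data instead. It is a minimal illustration, not part of the proposal; the sample log lines, the `count_pdf_downloads` helper, and the Apache combined-log format are all assumptions made for the example.

```python
import re

# Hypothetical sample of Apache combined-log lines from an IR server.
# The second request is a direct PDF fetch (a visitor arriving from
# Google Scholar); it never loads an HTML page, so page-tag analytics
# would not record it, yet it appears in the server log.
LOG_LINES = [
    '10.0.0.1 - - [01/Feb/2014:10:00:00 -0700] "GET /handle/1/320 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '10.0.0.2 - - [01/Feb/2014:10:01:00 -0700] "GET /bitstream/1/320/report.pdf HTTP/1.1" 200 204800 "https://scholar.google.com/" "Mozilla/5.0"',
    '10.0.0.3 - - [01/Feb/2014:10:02:00 -0700] "GET /bitstream/1/321/data.pdf HTTP/1.1" 304 0 "-" "Mozilla/5.0"',
]

# Extract the request path and HTTP status from each log line.
REQUEST_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_pdf_downloads(lines):
    """Count successful (2xx) GET requests for PDF files in access-log lines."""
    count = 0
    for line in lines:
        m = REQUEST_RE.search(line)
        if (m and m.group("path").lower().endswith(".pdf")
                and m.group("status").startswith("2")):
            count += 1
    return count

print(count_pdf_downloads(LOG_LINES))  # 1 (the 304 response is not counted)
```

A real implementation would also need to filter out crawler traffic and repeated requests, which is part of why the proposal argues that raw numbers can be inaccurate in both directions.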