

SLIDE 1

Does COUNTER tell the whole story?

Case-by-case examples demonstrating the limitations of COUNTER, and suggestions for alternative evaluation metrics

CARLI Spring Forum on Collections Data Analysis and Maintenance Governors State University, April 28th, 2017 Jonathan Shank Acquisitions & E-Resources Librarian Northwestern University Galter Health Sciences Library

SLIDE 2

Disclaimer

SLIDE 3

Institutional Context

  • Serves Northwestern's Feinberg School of Medicine in Chicago
  • Administratively separate from University Library in Evanston
  • Cost sharing with Evanston on big deal agreements
  • Separate standalone subscriptions and a medicine-specific collection
  • Member of CARLI, but not part of I-Share or union Voyager catalog
  • Entire NU system migrated to Alma in Summer of 2015
  • Galter maintains custom Primo front-end
  • Currently in transitional phase for handling of COUNTER
  • No ERMS or usage client, efforts currently focused on JR1 stats
  • Usage functionality coming to Alma this summer

Galter Health Sciences Library

SLIDE 4

COUNTER usage statistics

  • Standard format, impressive data set, and BIG numbers
  • “Consistency” across vendors
  • Ease of use for cost-per-use (CPU) analysis
  • Increasing compliance among vendors
  • Growing interoperability
  • Iterative improvements with each new release


What works well
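The CPU analysis mentioned above is simple arithmetic: annual subscription cost divided by COUNTER full-text downloads. A minimal Python sketch; the titles, prices, and download counts below are invented examples, not real figures.

```python
# Hypothetical cost-per-use (CPU) calculation from annual JR1 totals.
# All titles, costs, and counts are invented for illustration.
subscriptions = {
    "Journal A": {"cost": 4500.00, "jr1_downloads": 1800},
    "Journal B": {"cost": 1200.00, "jr1_downloads": 35},
}

def cost_per_use(cost, downloads):
    """Return cost per full-text download; None if there was no usage."""
    return cost / downloads if downloads else None

for title, data in subscriptions.items():
    cpu = cost_per_use(data["cost"], data["jr1_downloads"])
    print(f"{title}: ${cpu:.2f} per download")
```

The same division underlies most CPU dashboards; the hard part, as the following slides show, is trusting the denominator.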

SLIDE 5

COUNTER usage statistics

Active and engaged community of librarians, publishers and vendors.

What works well

SLIDE 6

COUNTER usage statistics

  • Merging multiple providers and platforms is difficult unless you have an aggregator client (e.g. UStat, 360 Resource Manager, CORAL)
  • Manual retrieval of reports
  • Still necessary despite major improvements from SUSHI
  • Login credentials must be stored & maintained, which is difficult with shared licenses
  • Issues with accuracy and title consistency for historical titles and for title changes, splits, and merges
  • Stats may be inaccurate or useless as a result

What doesn’t work so well
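The merging chore in the first bullet is, at its core, summing per-platform JR1 counts by title, which is what an aggregator client automates. A minimal sketch; the platform names and counts echo the Journal of Dermatological Science example later in this deck.

```python
# Hypothetical consolidation of per-platform JR1 totals into one view,
# the kind of merge an aggregator client would normally perform.
from collections import defaultdict

jr1_by_platform = {
    "ScienceDirect": {"Journal of Dermatological Science": 397},
    "ClinicalKey":   {"Journal of Dermatological Science": 1},
}

merged = defaultdict(int)
for platform, counts in jr1_by_platform.items():
    for title, downloads in counts.items():
        merged[title] += downloads  # same title can appear on several platforms

print(dict(merged))  # {'Journal of Dermatological Science': 398}
```

In practice the merge is harder than this sketch suggests: title changes, splits, and merges mean the same journal may not share one clean key across platforms.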

SLIDE 7

COUNTER usage statistics

  • Occasional issues with accuracy, compliance, and reliability
  • Overlapping accounts, IP ranges, and multiple access points can inflate or deflate numbers
  • Lack of distinction by location, school, department, or affiliation
  • Not available for some resources


What doesn’t work so well

SLIDE 8

COUNTER usage statistics

Individual usage is a relatively flat or static indicator of impact and value.

“Statistics are a measurement of users’ actions that we try to correlate to their intentions.”

Oliver Pesch, EBSCO Publishing

What doesn’t work so well

SLIDE 9

Specific Examples

Demonstrating the limitations of COUNTER

SLIDE 10

Example 1: Inflated numbers

  • Some platforms load HTML full text automatically; if the user then clicks the PDF, the download can be counted twice
  • Some linking mechanisms, like CrossRef, allow publishers to choose the linking level (e.g. link to the TOC, abstract, HTML, or PDF)
  • COUNTER is continuously working to improve and resolve these issues
  • Publisher interference, or at the very least optimization for high stats, is still possible

Numbers can be inflated by a publisher’s interface & platform design

SLIDE 11

Example 2: IP issues

  • On the vendor side, most usage in COUNTER reports is ultimately attributed to accounts based on IP addresses
  • According to a recent study/audit, 58% of the IPs held by publishers to authenticate libraries are wrong (Spence, PSI Ltd)


Incorrect IP information can distort figures

SLIDE 12

Example 3: Problems distinguishing locations

  • IPs often overlap between departments, schools, and campuses, making usage indistinguishable by location
  • NU has campuses in Evanston, Chicago, and Qatar with overlapping IPs
  • Content at NU is licensed by several different entities for different groups of users
  • Accounts themselves also overlap in locations and access entitlements, which are lumped together in COUNTER

“There is no single way [outlined in the COUNTER code of practice] for providers to categorize usage transactions to capture reporting by subsets.”

Project COUNTER

COUNTER still has limitations with location or account specific reporting
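The overlap problem can be made concrete with Python's standard `ipaddress` module: when one campus's registered range contains another's, a single address legitimately matches both, so usage attributed by IP cannot be split by location. The ranges below are invented examples, not NU's actual networks.

```python
# Hypothetical illustration of overlapping campus IP ranges.
# The networks are invented; real registrations vary.
import ipaddress

campus_ranges = {
    "Evanston": ipaddress.ip_network("10.0.0.0/15"),
    "Chicago":  ipaddress.ip_network("10.1.0.0/16"),  # sits inside the /15 above
}

def campuses_for(ip):
    """Return every campus whose registered range contains this address."""
    addr = ipaddress.ip_address(ip)
    return [name for name, net in campus_ranges.items() if addr in net]

print(campuses_for("10.1.2.3"))  # matches both: ['Evanston', 'Chicago']
```

A vendor resolving that address to an account has no principled way to pick one campus, which is exactly why COUNTER totals get lumped together.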

SLIDE 13

Example 3: Problems distinguishing locations

[Diagram: GHSL licenses EMBASE and ClinicalKey; NUL licenses ScienceDirect, Scopus, and Cell Press; LCH licenses ClinicalKey Nursing; GHSL, NUL, NMH, and LCH all access content across these licenses]

Overview of NU’s Elsevier landscape

SLIDE 14

Example 4: Lack of context or normalization

An undergraduate padding out a works-cited list for an English 101 paper vs. faculty conducting research for a major grant or high-impact publication. Usage and information-seeking behaviors may vary widely by discipline, research area, or department.

Not all usage is created equal, but it’s treated equally

SLIDE 15

Example 5: False negatives

  • The journal is licensed by Galter Library through Elsevier’s ClinicalKey
  • It showed only 1 full-text download in ClinicalKey’s 2016 JR1
  • Citation analysis indicated the journal was cited 46 times by NU scholars in the same time period: an obvious discrepancy
  • The title is also available through NUL’s ScienceDirect Freedom Collection
  • It showed 397 full-text downloads in ScienceDirect’s 2016 JR1

Journal of Dermatological Science
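One way to catch this kind of false negative systematically is to flag titles whose local citation counts dwarf their COUNTER downloads. A hypothetical sketch: the threshold and data shapes are assumptions, while the JR1 and citation figures come from the example above.

```python
# Hypothetical false-negative screen: heavily cited but barely
# downloaded suggests a usage-reporting gap, not a low-value title.
# The 10x threshold is an arbitrary assumption for illustration.
usage = {
    "Journal of Dermatological Science": {"jr1": 1, "citations": 46},
}

def looks_like_false_negative(jr1, citations, ratio=10):
    """True when citations exceed downloads by the given ratio."""
    return citations >= ratio * max(jr1, 1)

flagged = [title for title, d in usage.items()
           if looks_like_false_negative(d["jr1"], d["citations"])]
print(flagged)  # ['Journal of Dermatological Science']
```

Anything flagged would then be investigated manually, e.g. by checking whether the same title is also accessible on another platform, as it was here.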

SLIDE 16

Alternative usage metrics

Substitutes and supplements for COUNTER

SLIDE 17

Alternative usage metrics

  • Pros
  • Data is potentially stored in one place with a single access point
  • Possibility to capture user affiliation, domain or location
  • Integration with Google Analytics or other log analysis tools
  • Cons
  • Initial setup is manual and can be complicated
  • Some programming knowledge may be required
  • Not all traffic goes through proxy (on campus, VPN, etc.)
  • Not all institutions have a single proxy server

Proxy logs
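Capturing affiliation or target domain from proxy logs is mostly a parsing exercise. A hypothetical sketch against an EZproxy-style log line: the log format, username, and URL are invented, and real deployments configure their own log fields.

```python
# Hypothetical parse of an EZproxy-style log line to extract the
# authenticated user and the target domain. Format is an assumption.
import re
from urllib.parse import urlsplit

LOG_LINE = ('10.1.2.3 meduser [28/Apr/2017:10:15:00 -0500] '
            '"GET https://www.sciencedirect.com/science/article/pii/X HTTP/1.1" 200')

pattern = re.compile(
    r'^(?P<ip>\S+) (?P<user>\S+) \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+)'
)
m = pattern.match(LOG_LINE)
domain = urlsplit(m.group("url")).hostname
print(m.group("user"), domain)  # meduser www.sciencedirect.com
```

Aggregating lines like this by user group or department is what gives proxy logs the location and affiliation detail that COUNTER lacks, for the traffic that actually passes through the proxy.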

SLIDE 18

Alternative usage metrics

  • Pros
  • Can be much easier to retrieve, depending on your resolver
  • Alma has some functionality built into Analytics, with more coming in the next release
  • Generally found to correlate closely with COUNTER stats
  • Potential to capture user affiliation, domain, and/or location
  • Cons
  • Manual setup may be required
  • Does all of your traffic really go through the link resolver?
  • Galter routes PubMed traffic back to customized resolver

Link resolver logs, stats and analytics

SLIDE 19

Alternative usage metrics

  • Pros
  • Identifies usage based on actual research output; demonstrates impact
  • Depending on how it’s collected, data can be normalized and contextualized by school, subject, or research area
  • Could identify low-use, high-impact titles and save them from cancellation
  • Cons
  • Not as useful for non-research-oriented institutions (e.g. liberal arts & community colleges)
  • Doesn’t capture scholarly usage outside of publishing
  • Galter team is currently working on a project in this area for NASIG; stay tuned!

Citation data

SLIDE 20

Main takeaways

  • Useful to have multiple evaluation metrics to check against
  • Outliers or anomalies from one metric can be investigated further with others
  • Different metrics for different titles
  • Institutional context plays a large role
  • Systems, licensing, and locations
  • Mission of school, level of research activity


No single metric is a silver bullet

SLIDE 21

Questions?

SLIDE 22

Thank You!

j-shank@northwestern.edu @ShankLib

SLIDE 23

References

Bennett, N. (2015). “Could we ever get rid of usage statistics?” Insights, 28(1), 83–84. DOI: http://doi.org/10.1629/uksg.222

Davis, P. M., & Price, J. S. (2006). “eJournal interface can influence usage statistics: Implications for libraries, publishers, and Project COUNTER.” Journal of the American Society for Information Science and Technology, 57(9), 1243–1248.

De Groote, S. L., Blecic, D. D., & Martin, K. (2013). “Measures of health sciences journal use: a comparison of vendor, link-resolver, and local citation statistics.” Journal of the Medical Library Association: JMLA, 101(2), 110.

Haustein, S. (2012). Multidimensional Journal Evaluation: Analyzing Scientific Periodicals beyond the Impact Factor. De Gruyter.

Kennedy, M. R., & LaGuardia, C. (2013). Marketing Your Library’s Electronic Resources: A How-To-Do-It Manual for Librarians. American Library Association.

Orcutt, D. (2010). Library Data: Empowering Practice and Persuasion. Libraries Unlimited.

Rathemacher, A. J. (2010). “E-Journal Usage Statistics in Collection Management Decisions: A Literature Review.” In D. Orcutt (Ed.), Library Data: Empowering Practice and Persuasion, 71–89. Libraries Unlimited.

Stamison, C., Niemeyer, T., & Tucker, C. (2009). “Usage Statistics: The Perks, Perils and Pitfalls.” Proceedings of the Charleston Library Conference. http://dx.doi.org/10.5703/1288284314761
