This fall, data-driven libraries (and librarians) take the spotlight in the ProQuest Blog. Look for this helpful series starting in September.  
Sara E. Morris, Associate Content Development Librarian and Lea Currie, Head of Content Development at University of Kansas Libraries
Working with a flat collections budget for more than seven years, the University of Kansas (KU) Libraries has been forced to identify databases and journals for cancellation. Initially, it was relatively easy to find “low-hanging fruit” for savings, but it has become increasingly difficult to identify enough savings to carry the expense of large journal packages year after year.
At the end of 2016, one of our largest journal packages will come up for renewal. While usage data provided by the publisher demonstrates that this particular package is still a “good deal,” it receives the lowest usage of our large journal packages.
This journal package is a target for cancellation, but not before a long and tedious evaluation process.   
This particular publisher was unwilling to negotiate a smaller package deal at a more affordable rate, so with no reasonable alternative, we began the cancellation process by building a case that would answer any questions about our decision-making.
Usage data, obviously
Usage data is an obvious place to start when evaluating the importance of electronic resources and is a cornerstone for making retention decisions. In the old days, librarians looked at check-out cards or the tally marks recorded while re-shelving volumes from the carts next to the photocopier.
In today’s electronic library environment, vendors provide usage data. Although the data is COUNTER compliant, comparing different publishers or vendors is still “apples and oranges.” Counts of clicks, searches, and PDF and HTML downloads can be supplemented with many other kinds of data. Although each evaluation is different, depending on the type of collection, we have learned to look well beyond the most basic benchmarks and to use many different types of data and information to make collection decisions.
The most recent journal package to come up for renewal presented KU librarians with an opportunity to gather evaluation data creatively. Going into this assessment project, we believed canceling the package was a strong possibility, and to be well prepared to share the decision with campus, we needed to be comprehensive, leaving no stone unturned.
Historically, Acquisitions and Content Development worked together to evaluate large journal packages. If an entire package was to be canceled, this process consisted of calculating cost per use, estimated price increases, potential interlibrary loan costs, and other obvious metrics.
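As a rough illustration of the most basic of these metrics, the sketch below computes a package’s cost per use from its annual cost and COUNTER-style download counts. The figures and variable names are invented for illustration, not KU’s actual data:

```python
# Sketch: cost per use for a journal package, from annual package cost
# and COUNTER-style full-text download counts. All numbers here are
# hypothetical, for illustration only.

package_cost = 250_000.00  # invented annual subscription cost (USD)
downloads_by_year = {2013: 41_200, 2014: 39_800, 2015: 38_500}

total_downloads = sum(downloads_by_year.values())
avg_annual_downloads = total_downloads / len(downloads_by_year)

cost_per_use = package_cost / avg_annual_downloads
print(f"Average annual downloads: {avg_annual_downloads:,.0f}")
print(f"Cost per use: ${cost_per_use:.2f}")
```

Comparing the resulting figure against a typical interlibrary loan cost per article gives a quick first read on whether a package earns its price.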
Data Deep Dive
Once again, we started with the same metrics. As expected, looking at the usage data alone made it obvious that the 80/20 adage held true: a small portion of the journals generated the highest use, some had middling usage, and a significant portion had little to no use. This immediately pointed to focusing on the most heavily used titles, the top 500 used in the last three years.
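The 80/20 check itself is easy to run on any usage export. The sketch below uses a simulated, Zipf-like distribution of downloads rather than real figures, but the same loop works on a sorted column of actual counts:

```python
# Sketch: checking the 80/20 pattern and counting how many titles carry
# 80% of total use. The usage counts are simulated (Zipf-like), not real.

# Simulated downloads for 2,000 titles: heavy head, long near-zero tail.
usage = [1000 // rank for rank in range(1, 2001)]  # already descending

total = sum(usage)
cumulative = 0
titles_needed = 0
for count in usage:
    cumulative += count
    titles_needed += 1
    if cumulative >= 0.8 * total:
        break

print(f"{titles_needed} titles ({titles_needed / len(usage):.0%}) "
      f"account for 80% of downloads")
```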
Using only a small portion of the title list meant we could devote significantly more time to collecting data than in the past.  It gave us freedom and time to be creative. Our spreadsheet filled up with the following data for each of the top 500 journals:
- Number of aggregators that provide access to a journal title
- Aggregator’s dates of full text 
- Aggregator’s length of embargo 
- Titles from aggregators that cover a journal title
- Linda Hall Library access (As a member of the Center for Research Libraries (CRL), their serial collections are available via interlibrary loan)
- 2014 Impact Factors
- 5 year Impact Factors
Some of this data was far more useful than the rest. Through collecting the aggregator information, we determined that many of these titles are available through aggregators with a one-year embargo. Consequently, not only were we paying for those titles twice, but the package was essentially buying only the most recent year of access. Establishing which aggregator subscriptions are most valuable for providing access to the top 500 journals is useful in determining whether to eliminate the package.
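To make the embargo arithmetic concrete, here is a minimal sketch (with invented years and embargo lengths) of how many years of a journal’s run remain exclusive to the package once aggregator coverage is accounted for:

```python
# Sketch: years of a journal's run that only the package supplies, given
# the aggregator's full-text start year and embargo length in years.
# All dates and numbers are hypothetical.

def exclusive_years(package_start, aggregator_start, embargo_years):
    """Years only the package covers: backfile years before aggregator
    coverage began, plus the recent years held back by the embargo."""
    pre_coverage = max(0, aggregator_start - package_start)
    return pre_coverage + embargo_years

# Package holds 1997-present; aggregator covers 2000-present, 1-year embargo:
print(exclusive_years(1997, 2000, 1))  # → 4

# Aggregator coverage predates the package subscription, 1-year embargo:
print(exclusive_years(2005, 2000, 1))  # → 1
```

When the exclusive window shrinks to little more than the embargo itself, the package is effectively being paid for one year of unique access.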
While it is interesting to look at Impact Factors, this data was not that useful in determining whether or not to cancel the package.  Perhaps we will use it in the future, but for now we haven’t focused on it.  
We also quickly concluded that our CRL membership, and the access it provides to Linda Hall Library titles through interlibrary loan, was not what we had hoped for, since Linda Hall does not subscribe to many of the most used titles in the journal package.
New Kinds of Data 
Two years ago, as part of a journal cancellation project, we developed a website with a list of proposed cancellations for faculty review. Faculty who served in editorial capacities for those journals were quick to alert us, and a few pushed back on the cancellations.
While canceling this package might be inevitable, we decided to be proactive and determine whether KU faculty serve on the editorial boards of journals in the package. In the past, collecting this type of data would have taken forever, but the recent push to collect research metrics on campus gave us a new way to learn about editorial service.
At our request, KU’s Faculty Professional Record Online (PRO) office generated a list of all self-reported editorial responsibilities in 2015. The list was long, but a skilled student employee wrote a program matching titles from the journal package against the list generated by PRO.
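A matching program along these lines has to normalize titles on both sides, since self-reported entries rarely match a title list exactly. The sketch below is our own illustration with invented sample data, not the student’s actual program:

```python
# Sketch: matching package journal titles against self-reported editorial
# titles from a faculty-records export. Both sides are free text, so we
# normalize before comparing. The sample titles and names are invented.
import re

def normalize(title: str) -> str:
    """Lowercase, drop punctuation, strip a leading article, collapse spaces."""
    t = re.sub(r"[^a-z0-9 ]", " ", title.lower())
    t = re.sub(r"^(the|a|an)\s+", "", t.strip())
    return re.sub(r"\s+", " ", t)

package_titles = ["The Journal of Prairie Studies", "Annals of Hypothetical Chemistry"]
pro_entries = [
    ("Prof. A", "Journal of Prairie Studies (editorial board)"),
    ("Prof. B", "Some Unrelated Quarterly"),
]

normalized_package = {normalize(t): t for t in package_titles}

matches = []
for person, reported in pro_entries:
    # Drop parenthetical role notes, then normalize before lookup.
    key = normalize(re.sub(r"\(.*?\)", "", reported))
    if key in normalized_package:
        matches.append((person, normalized_package[key]))

print(matches)  # → [('Prof. A', 'The Journal of Prairie Studies')]
```

Exact matching after normalization misses variant titles; a fuzzy-matching pass (edit distance, for instance) can catch those, at the cost of manually reviewing near misses.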
To ensure data quality, we checked each journal’s editorial board as listed on its webpage. Although this data will not play a significant role in our decision to cancel journals, it is still valuable.
As part of this exercise, we identified some of KU’s Foundation Distinguished Professors and other highly productive faculty members who serve on editorial boards. With this list of faculty, we can proactively explain why a title they help edit will no longer be available before a general message is shared with the entire campus.
Communication with the KU Campus
Early on in our evaluation of this journal package, the Dean of Libraries and the Director of Development and Communications for the KU Libraries were brought into the conversation.  The Dean determined that open communication with KU faculty was imperative throughout this process. He immediately talked with the Provost to make her aware of what was happening and why we could no longer support this large package of journals.  
Subsequently, the Provost asked the Dean to communicate our actions with the other deans on campus.  Our Dean drafted an open letter to the deans explaining why this cancellation was necessary, given the annual subscription increases and our flat budget.  
With the aid of the data we collected, the Director of Development and Communications will create a website that explains the financial problems besetting our library. It will list the top 500 journals from the package, along with their availability through aggregators and instructions for obtaining journal articles through our interlibrary loan service.
A feedback form will be provided to help us identify essential titles that we can reinstate after access to the journal package is no longer available.  
During the fall 2016 semester, we also plan to send a survey to faculty and graduate student listservs asking them to list the journal titles essential to their research.  
We will model our survey after a survey that was developed by our colleagues at the University of Montreal.  The intent of this survey is to identify essential journal titles from all publishers, not titles provided by this particular journal package.  Today and in the future, we aim to support the research of our faculty and students by providing access to the most important journals in their disciplines.
Our advice to libraries that face similar circumstances is to start the evaluation process as early as possible.
Anticipate questions that might arise, both internal and external to the libraries.  
Be creative developing a case for cancelling journal packages and openly communicate the evaluation process with your constituents.  
Communication is key to identifying the journals that are essential for the research taking place on your campus and making the process successful.
Lea Currie is the Head of Content Development at the University of Kansas (KU) Libraries and has been with KU since 1999.  Lea came to Kansas after getting her library degree from the University of Texas at Austin and her B.A. from Texas A&M University.
Sara E. Morris is the Associate Content Development Librarian at the University of Kansas.  Prior to this position she was KU’s American History Librarian.  Sara holds a PhD in History from Purdue University and an MLS from Indiana University.

09 Nov 2016
