Making the Data Visible: Ramping Up Library Reporting with LibraryTrac


Last fall, I was desperate for a robust library attendance management system. When I appealed to my friends on Twitter for help, fellow school librarian Margaux DelGuidice immediately responded and recommended LibraryTrac. I was sold and signed up right away. Since November 2015, many of you have heard me sing the praises of LibraryTrac, a tool that has streamlined the attendance sign-in process for our library while giving us the ability to track not only student visits but also other kinds of data related to student use of the library. Here is how Scott Allen, owner and developer of LibraryTrac, describes the service:

LibraryTrac is an application that allows libraries/media centers to keep track of their daily users and why those users are coming to use the library. The application allows librarians to designate reasons for using the library, as well as document which teacher students are coming from. It allows librarians to collect and analyze logged-in user statistics. A librarian can view the number of users over a period of time, in addition to particular days. If reasons for using the library are created, statistics will be generated to show how many users were in the library for those reasons during a given time period. Furthermore, librarians can create scheduled time frames with pre-determined start and end times for which to keep statistical data.

Not only can students sign in on multiple workstations with synchronized data, but they can even sign in using an easy-to-generate QR code. The platform is easy to use, and tech support is always just an email away. My wonderful library assistant, Carol Olson, says, “I love LibraryTrac.  The learning curve for staff is gentle, and it’s simple for the kids to use as well.” You can configure your reasons to meet your library’s needs and then collect data on how students are using the library throughout the day. Here is a sample report I ran today that gives you an idea of how students have been using our library since November 2015:

[Chart: LibraryTrac sign-in reasons for the year]

As you can see, the majority of our students (nearly 20,000 since we started using the application in mid-November 2015!) come for quiet study or to work on an assignment/homework individually.  This kind of data obviously informs how we will think about student needs for space, furniture, programming, and services in 2016-2017.

Our teachers and administrators love that LibraryTrac provides accountability for student visits, as we can easily check who has visited us and when; you can also give teachers the ability to check this data via a password-protected link.

If you are interested in trying out LibraryTrac for the remaining days of the 2015-16 school year, Scott Allen is currently offering a free trial! You can contact him using the information in the flyer included in this post.

Southeastern Library Assessment Conference 2015 Session Roundup: Correlation Between Library Instruction and Student Retention


Note:  This is the second in a series of posts on sessions I attended at the Southeastern Library Assessment Conference on November 16.  Please see the first post with more background information here.

Session Description

Grand Valley State University Libraries has found a statistically significant positive correlation between librarian-led instruction and student retention. The libraries’ Head of Instructional Services and the university’s Institutional Analysis department worked together to connect library instruction with student-level data and then analyze that data over several academic years. This presentation will outline the data collection process, analysis methods, limitations, and outcomes of a longitudinal study.

Session Notes

The most interesting session we attended was “Correlation Between Library Instruction and Student Retention” from Mary O’Kelly, Librarian and Head of Instructional Services at Grand Valley State University. Here are my notes and reflections from one of the best and most thoughtful conference sessions I have attended during my career as a librarian.

O’Kelly became the Head of Instructional Services in 2012 and very quickly began thinking about assessment planning.  Major questions she wanted to pursue included:

  • What questions do we have about our instructional program?
  • What data do we need to answer those questions?
  • How will we get it?
  • Who will analyze and interpret the results?

She also determined that she would need help gathering reliable data and expert assistance with the data analysis. She conducted literature reviews in these broad areas:

  1.  Relationships between library services and student retention
  2.  High-impact educational practices
  3.  Practices that positively impact student retention

O’Kelly was particularly interested in retention as part of her research because:

  • Retention is a significant issue in higher education.
  • Retention is a priority at her institution and in higher education at large.
  • She found nothing in the literature about information literacy instruction and student retention (though there were studies on the correlation between non-instructional aspects of library service and retention).
  • She felt it was a top priority for the library to demonstrate its value and impact on larger institutional goals.
  • She wanted to see if the data would support library instruction and collaboration between faculty and librarians as a high impact practice.

Research Methods

O’Kelly and her staff used LibAnalytics from Springshare to collect library instruction data. For each guide used with library instruction, staff entered:

  A.  the course code, course number, and section in separate fields
  B.  the professor name and librarian name
  C.  other details: date, location, duration, and content

Attendance was not taken; absences were assumed to fall within a margin of error.

This research endeavor presented an opportunity to establish a new relationship with the university’s Office of Institutional Analysis.  O’Kelly worked closely with Rachael Passarelli, the office’s research analyst.  Together they began with 23 questions about the library’s instructional program.  Initially, they used some of these questions to adjust instruction and to think about important data points for instruction:

  • Which programs were not being reached?
  • How many students were reached at each grade level?
  • What percentage of instruction sessions were in the library?
  • What is the distribution of the number of sessions over the course of an academic year?

After she developed the working list of 23 questions and conducted her literature review, the question that bubbled to the top focused on the relationship between information literacy instruction and student retention. This initial pass at the data led them to the big research question:

Of the students who saw a librarian in class, what percentage of them re-enrolled for the following fall compared to students who did not see a librarian? 

The null hypothesis:  there is no relationship between library instruction and student retention.  Retention was defined as re-enrollment the following fall semester.

After the data was collected over the course of an academic year, the LibAnalytics dataset and questions were sent to Passarelli and the Office of Institutional Analysis. Passarelli pulled student records from the course enrollments in the LibAnalytics data as part of her analysis. Only courses with at least one library session were analyzed; she used the chi-square test of independence in SAS. A fixed p-value of .05 was used to test significance; a general linear model was used to control for ACT scores, high school GPA, socioeconomic status, and first-generation status. The research project was IRB exempt since privileged student data was stripped before the data analysis was sent back to the library.
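For readers unfamiliar with the chi-square test of independence, here is a minimal sketch of how it works on a 2x2 table (librarian instruction vs. re-enrollment). The counts below are invented purely for illustration and are not O’Kelly’s data; her actual analysis was done in SAS and also controlled for covariates with a general linear model.

```python
# Minimal sketch of a chi-square test of independence on a 2x2 table.
# Counts are made up for illustration -- NOT the study's actual data.
# rows: received librarian-led instruction (yes / no)
# cols: re-enrolled the following fall (yes / no)
observed = [[1800, 200],
            [1500, 500]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected counts under the null hypothesis of independence
expected = [[r * c / n for c in col_totals] for r in row_totals]

chi2 = sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
           for i in range(2) for j in range(2))

# Critical value for df = 1 at the study's fixed alpha of .05
CRITICAL_05 = 3.841
print(f"chi2 = {chi2:.1f}")
if chi2 > CRITICAL_05:
    print("Reject the null: instruction and retention appear associated.")
```

With these invented counts, the statistic far exceeds the critical value, so the null hypothesis of independence would be rejected at the .05 level.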


The results over three years of data collection showed the following:

[Table: retention results for the three years of data collection]

As you can see, the findings were replicable over three years and statistically significant. In addition, the magnitude increased each year. Consequently, the null hypothesis was rejected. The data also shows the highest retention correlation with freshmen and graduate students. In order to triangulate the data and rule out a faculty effect, a recent follow-up analysis compared the retention of students whose faculty had a librarian come into any of their classes with that of students whose faculty did not (the analysis was by faculty, not by student). This follow-up analysis also showed a significant correlation, p-value = .0001.


Limitations

  • LibAnalytics record input is subject to human error
  • Attendance is estimated
  • Online instruction is excluded
  • Results cannot be generalized to other institutions
  • Retention is not persistence to graduation
  • Reasons why students withdraw are often complicated
  • Correlation is not causation (my note: I am deeply appreciative of this distinction, since so many library studies of recent years crow as though their results demonstrate causation when in fact they do not.)

Discussion/Next Steps

One of the limitations of library use studies is the student motivation factor. For O’Kelly’s study, intrinsic motivation for library engagement is removed because whole-class data was used. In addition, the large sample size is a strength of this study. O’Kelly wants to further explore why students are using the library and to consider the role of course assignments (given by faculty) in library use. At this time, the library instruction is very targeted because it is based on the school’s core curriculum, not the ACRL standards/framework.

Because faculty are including a librarian in assignment design and delivery, they are introducing the library as an academic support service to students. In light of her research showing that both faculty engagement with instruction librarians and student participation in library instruction are correlated with student retention, O’Kelly now wonders, “What’s the cause?” She now wants to test this working hypothesis:

Faculty engagement with library instruction is a high impact practice that positively affects student retention.

O’Kelly will be publishing her research in early 2016; I will be sure to alert you when her formal study is published. For now, you can see her slides here. I was most impressed by the depth of thought and how she tried to cover every possible angle with her research methodology. As I stated earlier, I also appreciate that she stresses her research shows correlation, not causation, a distinction I think is often lost in library studies in people’s efforts to advance advocacy agendas. The other attendees were also clearly impressed with her research methodology, attention to detail, and the clear and deliberate way she communicated her work. The session left me thinking about how her efforts might inspire my own research as a high school librarian and what data points matter to my local school and learning community. I hope to write more about where this may lead in the spring of 2016.

Southeastern Library Assessment Conference 2015: Introduction and Space Assessment Session 1


My friend and former Norcross High colleague Jennifer Lund and I attended the Southeastern Library Assessment Conference on November 16 at the historic Georgian Terrace Hotel in Atlanta, Georgia. Though we were probably the only school librarians there, we felt welcome and gleaned many pearls of wisdom from the sessions we attended. I was sadly only able to attend Day 1 (Monday, 11/16) due to district meetings I needed to attend on the second day (11/17), but I got MORE than my money’s worth from the sessions I attended. I highly recommend this conference if you are looking for smart, thoughtful perspectives that are grounded in evidence-based practice and data collection with integrity. The conference was limited to 125 people and had a pleasant, intimate feel; in addition, we were served a gourmet lunch buffet (it was fabulous) and many delicious amenities throughout the day (Starbucks coffee, tea, water, sodas, cookies). Many thanks to the conference organizers, who did a fantastic job with every aspect of the conference—it is by far one of the best and most meaningful conference experiences I’ve had in my career—every session had substance.

This is the first in a series of posts on the sessions Jennifer and I attended on Monday, November 16, 2015.

Space Assessment: How They Use It, What They Want, Sara DeWaay, University of North Carolina, Charlotte

Session Description:  Getting student input on the library space can be a multi-layered effort. Come hear about the methods used to get an understanding of use patterns, as well as the students’ desires for a small branch library, as we work to transition towards a flexible space.

My Notes:
The emphasis was on users and feedback from students; Sara thought about the feedback in terms of “low cost/easy” vs. “high cost/hard” solutions and ideas from the students. When she began the study, she thought of the library space in zones: group study, circulation area, lounge, quiet study, flexible, and creativity. She began by doing a literature review on space assessment, focusing on both qualitative and quantitative assessment methods. She also looked at space assessment from a “before” and “after” perspective, since assessment should continue after the space remodel or redesign is initially completed. She also did research on user-centered design. She formed a Student Advisory Group; positive aspects of this group included input, support, connection, and ownership for the students, but challenges included maintaining momentum and a sustained sense of meaningfulness for the students after their participation ended. In the future, Sara would try to make sure students received some sort of course credit for participation, perhaps as part of a project-based learning assignment related to space design.

She organized a student event where students could come and vote on designs; approximately 40-50 students participated. She used big notepads where students could vote with sticky notes on larger sheets of bulletin board or flip chart paper housed on easels.
She also used flip charts to get feedback from students using open-ended questions; she interspersed the flip charts with the buffet of food to “guide” students to this part of the feedback session. Students also had a chance to mark up floor plans; she provided a variety of tools for this activity, including crayons, Sharpies, ballpoint pens, colored pencils, and regular pencils. Students could then tape their proposed floor plans on the wall. Afterwards, she coded the feedback from the student floor plans using categories like “atmosphere” (with specific elements assigned letters A-J) and “physical space” (with specific aspects numbered 1-14). This method of floor plan coding then allowed her to look at the data in a “layered” way (example: 2B).
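Once feedback is coded this way, tallying the layered codes is straightforward. Here is a quick sketch of the idea; the codes below are invented examples, not Sara’s actual data.

```python
from collections import Counter

# Hypothetical coded floor-plan feedback (invented codes for illustration).
# Digits = a "physical space" element (1-14); letters = an "atmosphere"
# element (A-J); a layered code like "2B" combines one of each.
codes = ["2B", "2B", "7A", "2C", "7A", "14J", "2B"]

space = Counter(c.rstrip("ABCDEFGHIJ") for c in codes)  # physical-space part
mood = Counter(c.lstrip("0123456789") for c in codes)   # atmosphere part
layered = Counter(codes)                                # full layered codes

print(layered.most_common(1))  # -> [('2B', 3)]
print(space.most_common(1))    # most-requested physical-space element
```

Counting each layer separately as well as in combination is what lets the coded data answer both “which space elements came up most?” and “which space-plus-atmosphere pairings came up most?”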

Another strategy was student surveys.  Unfortunately, her sample size of 40 was not ideal, but nonetheless, she was able to ask more detailed questions about services as well as questions about the library in comparison to other spaces in the building.  She also had library student assistants help track space use; using iPads and Suma, they were able to gather data and plug it into LibAnalytics to get a better idea of space usage.

Once she looked at all the data, she was able to better understand student needs and could classify possible changes and redesign elements into these categories:

  • Low cost/easy to do
  • Low cost/difficult to do
  • High cost/easy to do
  • High cost/difficult to do

Unfortunately, the budget for the renovation was put on hold, but if it moves forward, Sara would get faculty input and do similar activities with staff. The major takeaway for me from this session was the idea of space assessment as cyclical—it should be ongoing, and it is important to continue even after you complete a renovation or redesign project, to make sure the new space is still working for students and to see what adjustments may be needed. This idea was especially helpful for Jennifer and me, since she has opened a new library space and I’m in the middle of working on a redesign project for the library here at Chattahoochee High.

My next post will be about the second session we attended on battling survey fatigue.