Making the Data Visible: Ramping Up Library Reporting with LibraryTrac

Last fall, I was desperate for a robust library attendance management system. As soon as I appealed to my friends on Twitter for help, fellow school librarian Margaux DelGuidice responded and recommended we try LibraryTrac. I was sold and signed up right away. Since November 2015, many of you have heard me sing the praises of LibraryTrac, a tool that has streamlined our library’s attendance sign-in process while letting us track not only student visits but other kinds of data about how students use the library. Here is how Scott Allen, owner and developer of LibraryTrac, describes the service:

LibraryTrac is an application that allows libraries/media centers to keep track of their daily users and why those users are coming to use the library. The application allows librarians to designate reasons for using the library, as well as document which teacher students are coming from. It allows librarians to collect and analyze logged-in user statistics. A librarian can view the number of users over a period of time, in addition to particular days. If reasons for using the library are created, statistics will be generated to show how many users were in the library for those reasons during a time period. Furthermore, librarians can create scheduled time frames for which to keep statistical data by setting up pre-determined start and end times.

Not only can students sign in on multiple workstations with synchronized data, but they can even sign in using an easy-to-generate QR code. The platform is easy to use, and tech support is always just an email away. My wonderful library assistant, Carol Olson, says, “I love LibraryTrac. The learning curve for staff is gentle, and it’s simple for the kids to use as well.” You can configure your sign-in reasons to meet your library’s needs and then collect data on how students are using the library throughout the year. Here is a sample report I ran today that gives you an idea of how students have been using our library since November 2015:

[Chart: LibraryTrac sign-in reasons for the year]

As you can see, the majority of our students (nearly 20,000 since we started using the application in mid-November 2015!) come for quiet study or to work on an assignment/homework individually.  This kind of data obviously informs how we will think about student needs for space, furniture, programming, and services in 2016-2017.

Our teachers and administrators love that LibraryTrac provides accountability for student visits: we can easily check who has visited us and when, and we can also give teachers the ability to check this data via a password-protected link.

If you are interested in trying out LibraryTrac at no charge for the remaining days of the 2015-16 school year, Scott Allen is currently offering a free trial! You can contact him using the information in the flyer included in this post.

Southeastern Library Assessment Conference 2015 Session Roundup: Correlation Between Library Instruction and Student Retention

CC image from http://bit.ly/1HIX8AP

Note:  This is the second in a series of posts on sessions I attended at the Southeastern Library Assessment Conference on November 17.  Please see the first post with more background information here.

Session Description

Grand Valley State University Libraries has found a statistically significant positive correlation between librarian-led instruction and student retention. The libraries’ Head of Instructional Services and the university’s Institutional Analysis department worked together to connect library instruction with student-level data and then analyze that data over several academic years. This presentation will outline the data collection process, analysis methods, limitations, and outcomes of a longitudinal study.

Session Notes

The most interesting session we attended was “Correlation Between Library Instruction and Student Retention,” presented by Mary O’Kelly, Librarian and Head of Instructional Services at Grand Valley State University. Here are my notes and reflections from one of the best and most thoughtful conference sessions I have ever attended in my career as a librarian.

O’Kelly became the Head of Instructional Services in 2012 and very quickly began thinking about assessment planning.  Major questions she wanted to pursue included:

  • What questions do we have about our instructional program?
  • What data do we need to answer those questions?
  • How will we get it?
  • Who will analyze and interpret the results?

She also determined that she would need help gathering reliable data and expert assistance with the data analysis. She conducted literature reviews in these broad areas:

  1. Relationships between library services and student retention
  2. High-impact educational practices
  3. Practices that impact student retention in a positive way

O’Kelly was particularly interested in retention as part of her research because:

  • Retention is a significant issue in higher education.
  • Retention is a priority at her institution and in higher education at large.
  • She found nothing in the literature about information literacy instruction and student retention (though there were other kinds of studies on the correlation between non-instructional aspects of library service and retention).
  • She felt it was a top priority for the library to demonstrate its value and impact on larger institutional goals.
  • She wanted to see if the data would support library instruction and collaboration between faculty and librarians as a high-impact practice.

Research Methods

O’Kelly and her staff used LibAnalytics from Springshare to collect library instruction data. For each guide used with library instruction, staff entered:

  • the course code, course number, and section in separate fields
  • the professor’s name and the librarian’s name
  • other details: date, location, duration, and content

Attendance was not taken; absences were assumed to fall within a margin of error.

This research endeavor presented an opportunity to establish a new relationship with the university’s Office of Institutional Analysis.  O’Kelly worked closely with Rachael Passarelli, the office’s research analyst.  Together they began with 23 questions about the library’s instructional program.  Initially, they used some of these questions to adjust instruction and to think about important data points for instruction:

  • Which programs were not being reached?
  • How many students were reached at each grade level?
  • What percentage of instruction sessions were in the library?
  • What is the distribution of the number of sessions over the course of an academic year?

After developing the working list of 23 questions and conducting her literature review, the question that bubbled to the top focused on the relationship between information literacy instruction and student retention. This initial pass at the data led them to the big research question:

Of the students who saw a librarian in class, what percentage of them re-enrolled for the following fall compared to students who did not see a librarian? 

The null hypothesis:  there is no relationship between library instruction and student retention.  Retention was defined as re-enrollment the following fall semester.

After the data was collected over the course of an academic year, the LibAnalytics dataset and questions were sent to Passarelli and the Office of Institutional Analysis. Passarelli pulled student records based on the course enrollments in the LibAnalytics data as part of her analysis. Only courses with at least one library session were analyzed; she used a chi-square test of independence, run in SAS. A fixed significance level of .05 was used to test significance, and a general linear model was used to control for ACT scores, high school GPA, socioeconomic status, and first-generation status. The research project was IRB exempt since privileged student data was stripped before the analysis was sent back to the library.
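
To make the chi-square step concrete, here is a minimal sketch of a test of independence on an instruction-versus-retention table. This is only an illustration in Python using scipy, with invented counts and an assumed 2x2 layout; the actual study was run in SAS on real student-level data.

    # Minimal sketch of a chi-square test of independence, analogous to the
    # analysis described above. The counts are invented for illustration and
    # are NOT the Grand Valley State data; the real analysis was done in SAS.
    from scipy.stats import chi2_contingency

    # Rows: had librarian-led instruction (yes/no)
    # Columns: re-enrolled the following fall (yes/no)
    observed = [
        [4200, 300],   # instruction: retained, not retained (hypothetical)
        [3500, 500],   # no instruction: retained, not retained (hypothetical)
    ]

    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")

    # Using the study's .05 threshold for significance:
    if p_value < 0.05:
        print("Reject the null hypothesis of no relationship.")
    else:
        print("Fail to reject the null hypothesis.")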

Results

The results over three years of data collection showed the following:

[Table: library instruction and student retention results across three years of data collection]

As you can see, the findings were replicable over three years and statistically significant. In addition, the magnitude of the effect increased each year. Consequently, the null hypothesis was rejected. The data also show the strongest retention correlation for freshmen and graduate students. In order to triangulate the data and rule out a faculty effect, a recent follow-up analysis compared the retention of students whose faculty had a librarian come into any of their classes with that of students whose faculty did not (the analysis was by faculty, not by student). This follow-up analysis also showed a significant correlation, with a p-value of .0001.

Limitations

  • LibAnalytics record input is subject to human error
  • Attendance is estimated
  • Online instruction is excluded
  • Results cannot be generalized to other institutions
  • Retention is not persistence to graduation
  • Reasons why students withdraw are often complicated
  • Correlation is not causation (my note: I am deeply appreciative of this distinction since so many library studies of recent years crow as though their results demonstrate causation when in fact they do not.)

Discussion/Next Steps

One of the limitations of library use studies is the student motivation factor. In O’Kelly’s study, intrinsic motivation for library engagement is removed as a factor because whole-class data was used. In addition, the large sample size is a strength of this study. O’Kelly wants to further explore why students are using the library and to consider the role of course assignments (given by faculty) in library use. At this time, the library instruction is very targeted because it is based on the school’s core curriculum, not the ACRL standards/framework.

Because faculty are including a librarian in assignment design and delivery, they are introducing the library as an academic support service to students. In light of her research showing that both faculty engagement with instruction librarians and student participation in library instruction are correlated with student retention, O’Kelly now wonders, “What’s the cause?” She now wants to test this working hypothesis:

Faculty engagement with library instruction is a high impact practice that positively affects student retention.

O’Kelly will be publishing her research in early 2016; I will be sure to alert you when her formal study is published. For now, you can see her slides here. I was most impressed by the depth of thought and how she tried to cover every possible angle with her research methodology. As I stated earlier, I also appreciate that she stresses her research shows correlation, not causation, a distinction I think is often lost in library studies in people’s efforts to advance advocacy agendas. The other attendees were also clearly impressed with her research methodology, attention to detail, and the clear and deliberate way she communicated her work. The session left me thinking about how her efforts might inspire my own research as a high school librarian and what data points matter to my local school and learning community. I hope to write more about where this may lead in the spring of 2016.

Sticky Notes as Formative Assessment for Information Literacy Instruction: Coding Student Responses

Yesterday I blogged about our pre-searching activities and the use of sticky notes for some gentle formative assessment. Today I want to share how I went about coding the student responses, not only to get a sense of students’ thinking during the two days of pre-searching, but also to use the data as a baseline of sorts as we hopefully look at a broad collection of their work and try to track their trajectory of growth and progress through this extended research unit.

Coding Information Sources

I began by removing the sticky notes for each period from the whiteboards, affixing them to large post-it notes, and labeling each grouping by period and response type. The next challenge was to think of categories for coding the student responses. “Information sources used” was the easiest starting point, so I began there.

Coding "Information Sources Used" Sticky Notes from Days 1 and 2 of PreSearch, 3rd Period #rustyq

I listed all the information sources from the LibGuide for the project and then tallied responses.  I wound up adding Google as another category since some students indicated they had used this search engine.  Here are the results by period:

2nd period Rust Sources Used Sticky Note Data, PreSearch, October 2014

3rd period Rust Sources Used Sticky Note Data, PreSearch, October 2014

In both classes, it appears Gale Opposing Viewpoints was a starting point for the majority of students; Gale Science in Context was next in popularity.  2nd period seemed to like SweetSearch and self-selected information sources while 3rd period leaned more heavily toward Academic Search Complete.

When we look at the updated topics roster (while taking into account the initial list of topics they had generated), the numbers are not too surprising. I know that many students will benefit from some guidance toward specific databases and search tools aligned with their topic choices as we move deeper into the project, but I’m not terribly surprised by what I see from the first two days of risk-free pre-search time spent simply narrowing down an interest area for one broad topic. This data, though, does suggest to me that there may be sources that are unfamiliar to students or that they have used only minimally in the past (as do the results from the information literacy skills needs survey we did via index cards with Ms. Rust a few weeks ago).

Questions

My categories for coding the questions students generated included:

  • Who
  • What
  • Where
  • When
  • How or Why
  • Topic Clarification
  • Question about the research or the assignment
  • Other (other types of questions, e.g., Is Finland’s educational system superior to the United States?)
  • None

2nd period posed 15 “how/why” questions and 11 questions that fell under “other”; there were 4 “who” questions and 6 “what” questions; three students did not note any questions. 3rd period generated questions that primarily fell under “what” (4), “how/why” (4), research/assignment questions (6), or “other” (6); five students did not generate any questions. Clearly, there is a stark contrast between the two classes in the types of questions they generated. This data may indicate that 3rd period needs more guided help in engaging more deeply with their articles OR strategies for generating questions.
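
As a side note, this kind of tally is easy to script once the coded responses are transcribed. Here is a minimal sketch in Python, assuming a hypothetical CSV (question_codes.csv) with one row per sticky note and columns for period and question category; my actual tallying was done by hand, so the file name and columns here are illustrative only.

    # Minimal sketch: tallying coded sticky-note responses by class period.
    # Assumes a hypothetical file question_codes.csv with columns
    # "period" and "category" -- invented for illustration.
    import csv
    from collections import Counter, defaultdict

    tallies = defaultdict(Counter)

    with open("question_codes.csv", newline="") as f:
        for row in csv.DictReader(f):
            tallies[row["period"]][row["category"]] += 1

    for period, counts in sorted(tallies.items()):
        print(period)
        for category, count in counts.most_common():
            print(f"  {category}: {count}")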

Discoveries and Insights

For this group of sticky note responses, I created these coding categories:

  • Fact or concrete detail
  • Concept/Conceptual
  • Question
  • Reflection
  • Commentary/Opinion/Reaction

Once I began taking a pass through the student responses, I realized I needed four additional categories:

  • Topic Ideas
  • Sources
  • None
  • Other

Second period students primarily recorded facts or concrete details for their notes; however, several used this space to think through additional topic ideas; the pattern was nearly identical in 3rd period.  I was not surprised by these findings since students spent only two days doing light pre-search and I knew in advance that getting enough information to eliminate some topic areas of interest would be where many would expend their time and energy.

Final Thoughts

The pre-search activity and days were designed to help students rule out some topics and have time to explore those of interest, and our sticky note method of formative assessment was one we felt would give us feedback without imposing a time-consuming structure on students, since we really wanted them to channel their energies into reading and learning more about their topic lists. While some of the data I coded was not surprising, I was really struck by the differences in the types of questions the two classes generated. Right now I don’t know if this means one class might need more help in generating questions from informational texts, or if perhaps they were approaching the reading and activity in a different way that didn’t lend itself to composing lots of questions at that early juncture.

If you are incorporating pre-search as part of the connecting cycle of inquiry, what kinds of formative assessments do you use?  If you code student responses, how do you approach that process, and how do you use that data to inform your instructional design?   I find this kind of work interesting—I am looking forward to seeing if any of these gray areas or perceived gaps come to light as we move further into our research unit this month.

The Unquiet Library Annual Report 2011, Part 1

I’m happy to share with you the first part of my annual library report that I created in Microsoft Word.  Each media center in our district is mandated to submit an annual report; in the past, we shared program highlights related to each of the four roles under Information Power, but I’m excited that we have transitioned to the five roles from Empowering Learners for this year’s report.

I always struggle with finding a balance in the information and data I include as I don’t want to have so much that it overwhelms a reader—it is easy to go overboard on charts/graphs/quotes—but of the four annual reports I’ve written, I think I’m happiest with this year’s edition. I think I’ve improved the narratives and organization this year as well as the use of graphics and media; I also love the Word template I selected to create the report because it looks clean and streamlined yet professional. Each year I try to incorporate a new element into my reports, and this year, I thought it would be fun to incorporate QR codes linking to specific web resources or videos related to a section of the annual report. In addition, I love the teacher quote sidebars in this year’s annual report—these “impact” statements were very humbling to read, and I am so grateful to work with colleagues who let me share the joy of teaching and learning with them and their students. A heartfelt thank you to my faculty who were able to share such powerful statements for inclusion in this year’s annual report.

While writing the annual report is truly a labor of love and quite time intensive for me, I also find it a valuable reflective exercise as it helps me to really connect dots of patterns I’ve observed over the last year, consider what didn’t work as planned and how to approach those challenges in the future, and to see the successes of the program that are so easy to overlook when you’re in the throes of daily library life.  The process also helps me start to crystallize ideas that have been simmering and take initial steps toward writing next year’s program goals/themes for the upcoming year.

I’ll be creating the annual report and video (in the same vein as last year’s) later this week, so look for a followup post here in which I’ll share the multimedia elements of the library annual report!