Note: This is the second in a series of posts on sessions I attended at the Southeastern Library Assessment Conference on November 17. Please see the first post with more background information here.
> Grand Valley State University Libraries has found a statistically significant positive correlation between librarian-led instruction and student retention. The libraries’ Head of Instructional Services and the university’s Institutional Analysis department worked together to connect library instruction with student-level data and then analyze that data over several academic years. This presentation will outline the data collection process, analysis methods, limitations, and outcomes of a longitudinal study.
The most interesting session we attended was “Correlation Between Library Instruction and Student Retention” from Mary O’Kelly, Librarian and Head of Instructional Services at Grand Valley State University. Here are my notes and reflections from one of the best, most interesting, and most thoughtful conference sessions I have ever attended during my career as a librarian.
O’Kelly became the Head of Instructional Services in 2012 and very quickly began thinking about assessment planning. Major questions she wanted to pursue included:
- What questions do we have about our instructional program?
- What data do we need to answer those questions?
- How will we get it?
- Who will analyze and interpret the results?
She also determined that she would need help gathering reliable data and expert help with the data analysis. She conducted literature reviews in these broad areas:
- Relationships between library services and student retention
- High-impact educational practices
- Practices that positively impact student retention
O’Kelly was particularly interested in retention as part of her research because:
- Retention is a significant issue in higher education.
- Retention is a priority at her institution and in higher education at large.
- She found nothing in the literature about information literacy instruction and student retention (though there were studies correlating non-instructional aspects of library service with retention).
- She felt it was a top priority for the library to demonstrate its value and impact on larger institutional goals.
- She wanted to see if the data would support library instruction and collaboration between faculty and librarians as a high-impact practice.
O’Kelly and her staff used LibAnalytics from Springshare to collect library instruction data. For each guide used with library instruction, staff entered:
- the course code, course number, and section in separate fields
- the professor name and librarian name
- other details: date, location, duration, and content

Attendance was not taken; absences were assumed to fall within a margin of error.
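The fields above amount to a simple per-session record. The sketch below is my own hypothetical modeling of that record in Python; it is not LibAnalytics’ actual schema, and the field names and sample values are invented for illustration:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class InstructionSession:
    """Hypothetical record mirroring the fields staff entered in
    LibAnalytics; not the product's actual schema."""
    course_code: str          # entered in its own field
    course_number: str        # entered in its own field
    section: str              # entered in its own field
    professor_name: str
    librarian_name: str
    session_date: date
    location: str
    duration_minutes: int
    content: str
    estimated_attendance: int  # attendance was estimated, not taken

# Invented example values, for illustration only
session = InstructionSession(
    course_code="WRT", course_number="150", section="01",
    professor_name="Doe", librarian_name="Smith",
    session_date=date(2014, 9, 15), location="Library classroom",
    duration_minutes=50, content="Database searching",
    estimated_attendance=25,
)
```

Keeping course code, number, and section in separate fields (rather than one free-text string) is what later makes it possible to join these records against institutional enrollment data.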
This research endeavor presented an opportunity to establish a new relationship with the university’s Office of Institutional Analysis. O’Kelly worked closely with Rachael Passarelli, the office’s research analyst. Together they began with 23 questions about the library’s instructional program. Initially, they used some of these questions to adjust instruction and to think about important data points for instruction:
- Which programs were not being reached?
- How many students were reached at each grade level?
- What percentage of instruction sessions were in the library?
- What is the distribution of the number of sessions over the course of an academic year?
After developing the working list of 23 questions and conducting her literature review, the question that bubbled to the top focused on the relationship between information literacy instruction and student retention. This initial pass at the data led them to the big research question:
Of the students who saw a librarian in class, what percentage of them re-enrolled for the following fall compared to students who did not see a librarian?
The null hypothesis: there is no relationship between library instruction and student retention. Retention was defined as re-enrollment the following fall semester.
After the data was collected over the course of an academic year, the LibAnalytics dataset and questions were sent to Passarelli and the Office of Institutional Analysis. Passarelli pulled student records from course enrollment in the LibAnalytics data as part of her analysis. Only courses with at least one library session were analyzed; she used the chi-square test of independence in SAS. A fixed p-value of .05 was used to test significance, and a general linear model was used to control for ACT scores, high school GPA, socioeconomic status, and first-generation status. The research project was IRB exempt since privileged student data was stripped before the data analysis was sent back to the library.
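To illustrate the core analysis method, here is a minimal sketch of a chi-square test of independence. This is not the actual GVSU analysis (which was done in SAS on real student records); the counts below are hypothetical, and Python’s `scipy` stands in for SAS:

```python
# Sketch of a chi-square test of independence on a 2x2 contingency table.
# All counts are hypothetical, for illustration only.
from scipy.stats import chi2_contingency

# Rows: received library instruction (yes / no)
# Columns: retained the following fall / not retained
observed = [
    [1800, 200],   # hypothetical counts: instructed students
    [1500, 300],   # hypothetical counts: non-instructed students
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")

# Test against the fixed .05 significance level the study used
if p < 0.05:
    print("Reject the null hypothesis: instruction and retention are associated")
else:
    print("Fail to reject the null hypothesis")
```

The test compares the observed retention counts against the counts expected if instruction and retention were independent; a p-value below .05 rejects the null hypothesis of no relationship. (Controlling for covariates such as ACT scores and GPA, as the study did with a general linear model, requires a regression model rather than this simple table test.)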
The results over three years of data collection showed that the findings were replicable and statistically significant, and the magnitude of the effect increased each year. Consequently, the null hypothesis was rejected. The data also showed the highest retention correlation among freshmen and graduate students. To triangulate the data and rule out a faculty effect, a recent follow-up analysis compared the retention of students whose faculty had a librarian come into any of their classes with that of students whose faculty did not (the analysis was by faculty, not by student). This follow-up analysis also showed a significant correlation (p = .0001).
O’Kelly also outlined the study’s limitations:
- LibAnalytics record input is subject to human error
- Attendance is estimated
- Online instruction is excluded
- Results cannot be generalized to other institutions
- Retention is not persistence to graduation
- Reasons why students withdraw are often complicated
- Correlation is not causation (my note: I am deeply appreciative of this distinction since so many library studies of recent years crow as though their results show causation when, in fact, they do not.)
One of the limitations of library use studies is the student motivation factor. In O’Kelly’s study, intrinsic motivation for library engagement is removed as a factor because whole-class data was used. In addition, the large sample size is a strength of this study. O’Kelly wants to further explore why students are using the library and to consider the role of course assignments (given by faculty) in library use. At this time, the library instruction is very targeted because it is based on the school’s core curriculum, not the ACRL standards/framework.
Because faculty are including a librarian in assignment design and delivery, they are introducing the library as an academic support service to students. In light of her research showing that both faculty engagement with instruction librarians and student participation in library instruction are correlated with student retention, O’Kelly now wonders, “What’s the cause?” She now wants to test this working hypothesis:
Faculty engagement with library instruction is a high-impact practice that positively affects student retention.
O’Kelly will be publishing her research in early 2016; I will be sure to alert you when her formal study is published. For now, you can see her slides here. I was most impressed by the depth of thought and how she tried to cover every possible angle with her research methodology. As I stated earlier, I also appreciate that she stresses her research shows correlation, not causation, a distinction I think is often lost in library studies in people’s efforts to advance advocacy agendas. The other attendees were also clearly impressed with her research methodology, attention to detail, and the clear and deliberate way she communicated her work. The session left me thinking about how her efforts might inspire my own research as a high school librarian and what data points matter to my local school and learning community. I hope to write more about where this may lead in the spring of 2016.