For the last few weeks, I have contemplated doing a midyear/end of semester report. While I have done monthly and yearly reports in the past, I have never done one at this point in the academic year. However, I thought it would be a great reflective exercise for me as a school librarian in a new school and new district as well as a great way to share what has been happening with my administrative team. My friend and former NHS colleague Jennifer Lund inspired me to use Piktochart after showing me her fabulous midyear report. Here is my own rendition, which you can view by scrolling or in presentation view by clicking on the presentation icon in the upper right-hand corner of the screen.
Today’s guest post is from my friend and former colleague at Norcross High, Dan Byrne. Both Dan and his wife, Dr. Melinda Byrne, are accomplished teachers; Dan was our Teacher of the Year at Norcross HS for 2014-15, and Melinda was one of the finalists for the same award this fall. Collaborative efforts with Dan were featured last year on the blog, and I am delighted he is continuing to integrate pieces of that work with his students. Earlier this semester, Dan listened to his students and their points of need by modifying the large group write-around strategies we had done in 2014-2015. Here are his reflections on that process!
The Mini Write-Around
I teach IB History to highly motivated juniors and seniors. By highly motivated, I mean the kind of students who will read six chapters the night before a test just to make sure they feel confident. IB History differs from many subjects because it rewards high-level conceptual thinking paired with students' choice of very specific facts to back up those concepts.
That is why I like “Write-Around” as a strategy. I often give students a quote, an old IB test prompt, or even just a theme and have them add their ideas as they work cooperatively. I find this a non-threatening, fun change of pace for students to review, build concepts, or practice the skills of supporting or refuting ideas.
Students sometimes remark that they enjoy Write-Around, but that they wish they could take all the ideas with them. (Apparently, the iPhone photos that so many kids take of content in the classrooms are never looked at again.) Because of this, I decided that I would try to shrink my Write-Around by writing on a standard-size piece of copy paper instead of using a display-sized sheet. Here is what we did:
- Students initially walked around and responded just like a regular write-around. However, they soon decided it was more efficient to pass the sheets from desk to desk, so they did. (Like I said earlier, the challenge was the amount of information they had in their heads.) In my smaller classes, students worked in pairs; in larger classes, students worked in groups of four. Students wrote significant ideas, themes, and facts about aspects of WWI, and they wanted to keep this information for review after class ended.
- Students then engaged in small and whole group discussion.
- After the class discussions, students “starred” the comments from the write-around they felt were notable, exemplary, or important in some way.
Once students had completed the activities, I copied their Write-Arounds so each student could have a packet of copies of their peers’ responses.
Students liked the Mini Write-Around because they felt “less pressured” to get all the information down (sometimes these kids lose the forest for the trees). They also liked that the paper fit on a standard desk. We were also able to have more “writing stations” than we would with the traditional write-around. The only drawback was that some students remarked that they didn’t have enough space to develop their ideas.
My main complaint about this modified approach was that students got very “facty” on this assignment. I think this was partly due to the paper size and partly due to the fact that I did not have good prompts for them to build concepts around. Another strategy to try would be to give students time to take notes at the end of the activity; that would force them to distill the ideas on their own rather than depending on me to give them a shotgun approach. Another modification for the future: groups of writers need to be smaller so everyone can see the paper. I feel that three is the ideal group size for my kids: enough to generate discussion but not so many that they are butting heads. I also need to give them more time to write (I think I thought, “less paper, less time”). Last but not least, the students need to write in ink so the copier clearly picks up what they wrote.
It’s Your Turn
How are you all integrating and modifying written conversation strategies to meet your students’ needs? Please share your experiences and variations in the comments below!
Note: This is the second in a series of posts on sessions I attended at the Southeastern Library Assessment Conference on November 17. Please see the first post with more background information here.
Grand Valley State University Libraries has found a statistically significant positive correlation between librarian-led instruction and student retention. The libraries’ Head of Instructional Services and the university’s Institutional Analysis department worked together to connect library instruction with student-level data and then analyze that data over several academic years. This presentation will outline the data collection process, analysis methods, limitations, and outcomes of a longitudinal study.
The most interesting session we attended was “Correlation Between Library Instruction and Student Retention” from Mary O’Kelly, Librarian and Head of Instructional Services at Grand Valley State University. Here are my notes and reflections from one of the best, most interesting, and most thoughtful conference sessions I have ever attended during my career as a librarian.
O’Kelly became the Head of Instructional Services in 2012 and very quickly began thinking about assessment planning. Major questions she wanted to pursue included:
- What questions do we have about our instructional program?
- What data do we need to answer those questions?
- How will we get it?
- Who will analyze and interpret the results?
She also determined that she would need help gathering reliable data and expert help with the data analysis. She conducted literature reviews in these broad areas:
- Relationships between library services and student retention
- High-impact educational practices
- Practices that impact student retention in a positive way
O’Kelly was particularly interested in retention as part of her research because:
- Retention is a significant issue in higher education.
- Retention is a priority at her institution and in higher education at large.
- She found nothing in the literature about information literacy instruction and student retention (though there were studies on the correlation between non-instructional aspects of library service and retention).
- She felt it was a top priority for the library to demonstrate its values and impact on larger institutional goals.
- She wanted to see if the data would support library instruction and collaboration between faculty and librarians as a high impact practice.
O’Kelly and her staff used LibAnalytics from Springshare to collect library instruction data. For each guide used with library instruction, staff entered:
- the course code, course number, and section (in separate fields)
- the professor's name and the librarian's name
- other details: date, location, duration, and content
Attendance was not taken; absences were assumed to fall within a margin of error.
This research endeavor presented an opportunity to establish a new relationship with the university’s Office of Institutional Analysis. O’Kelly worked closely with Rachael Passarelli, the office’s research analyst. Together they began with 23 questions about the library’s instructional program. Initially, they used some of these questions to adjust instruction and to think about important data points for instruction:
- Which programs were not being reached?
- How many students were reached at each grade level?
- What percentage of instruction sessions were in the library?
- What is the distribution of the number of sessions over the course of an academic year?
After developing the working list of 23 questions and conducting her literature review, the question that bubbled to the top focused on the relationship between information literacy instruction and student retention. This initial pass at the data led them to the big research question:
Of the students who saw a librarian in class, what percentage of them re-enrolled for the following fall compared to students who did not see a librarian?
The null hypothesis: there is no relationship between library instruction and student retention. Retention was defined as re-enrollment the following fall semester.
After the data was collected over the course of an academic year, the LibAnalytics dataset and questions were sent to Passarelli and the Office of Institutional Analysis. As part of her analysis, Passarelli pulled student records from the course enrollments in the LibAnalytics data. Only courses with at least one library session were analyzed, and she used the chi-square test of independence in SAS. A significance threshold of p = .05 was used, and a general linear model was used to control for ACT scores, high school GPA, socioeconomic status, and first-generation status. The research project was IRB exempt since privileged student data was stripped before the data analysis was sent back to the library.
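For readers curious about the statistics, here is a minimal pure-Python sketch of a chi-square test of independence on a 2x2 table of the kind implied by the study (librarian instruction vs. fall re-enrollment). The counts below are invented purely for illustration; the actual study ran in SAS on real student-level data.

```python
# Hypothetical 2x2 contingency table (counts invented for illustration):
# rows = saw a librarian in class (yes / no)
# cols = re-enrolled the following fall (yes / no)
observed = [
    [1800, 200],   # instructed: retained, not retained
    [1500, 300],   # not instructed: retained, not retained
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Chi-square statistic: sum of (observed - expected)^2 / expected,
# where expected counts assume independence of the two variables.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

# Critical value for df = (2-1)*(2-1) = 1 at alpha = .05
CRITICAL_VALUE = 3.841
print(f"chi2 = {chi2:.2f}")
print("Reject null" if chi2 > CRITICAL_VALUE else "Fail to reject null")
```

With these made-up counts the statistic exceeds the critical value, so the null hypothesis of independence would be rejected; the real study reached the same conclusion on its own data.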
The results over three years of data collection showed the following:
As you can see, the findings were replicable over three years and statistically significant, and the magnitude increased each year. Consequently, the null hypothesis was rejected. The data also shows the highest retention correlation for freshmen and graduate students. In order to triangulate the data and rule out a faculty effect, a recent follow-up analysis compared the retention of students whose faculty had a librarian come into any of their classes with that of students whose faculty did not (the analysis was by faculty, not by student). This follow-up analysis also showed a significant correlation (p = .0001). O'Kelly also acknowledged several limitations of the study:
- LibAnalytics record input is subject to human error
- Attendance is estimated
- Online instruction is excluded
- Results cannot be generalized to other institutions
- Retention is not persistence to graduation
- Reasons why students withdraw are often complicated
- Correlation is not causation (my note: I am deeply appreciative of this distinction since so many library studies of recent years crow as though their results are causation when in fact they are not.)
One of the limitations of library use studies is the student motivation factor; in O'Kelly's study, intrinsic motivation for library engagement is removed as a confound because whole-class data was used. In addition, the large sample size is a strength of this study. O'Kelly wants to further explore why students are using the library and to consider the role of course assignments (given by faculty) in library use. At this time, the library instruction is very targeted because it is based on the school's core curriculum, not the ACRL standards/framework.
Because faculty are including a librarian in assignment design and delivery, they are introducing the library as an academic support service to students. In light of her research showing that faculty engagement with instruction librarians is correlated with student retention and that student participation in library instruction is correlated with student retention, O'Kelly now wonders, “What's the cause?” She wants to test this working hypothesis:
Faculty engagement with library instruction is a high impact practice that positively affects student retention.
O’Kelly will be publishing her research in early 2016; I will be sure to alert you when her formal study is published. For now, you can see her slides here. I was most impressed by the depth of thought and how she tried to cover every possible angle with her research methodology. As I stated earlier, I also appreciate that she stresses her research shows correlation, not causation, a distinction I think is often lost in library studies amid people's efforts to advance advocacy agendas. The other attendees were also clearly impressed with her research methodology, attention to detail, and the clear and deliberate way she communicated her work. The session left me thinking about how her efforts might inspire my own research as a high school librarian and what data points matter to my local school and learning community. I hope to write more about where this may lead in the spring of 2016.