When Less Is More: Discovering Student Points of Need with Small Group Conversation

As any classroom teacher knows, time is a valuable commodity.  It’s always a struggle to squeeze every last drop out of the instructional time we have with students while still providing meaningful learning experiences.  One of our English teachers, Kim Cooney, recognized she needed a way to negotiate two major class activities with her 10th grade students:

  1.  She needed to have small seminar discussions with students.
  2.  She needed students to have some instruction on EasyBib and research databases for a project in which they are investigating issues related to social media.

Ms. Cooney asked me if I would be comfortable working with half of her class in the media center while she did seminar with the other half in her classroom.  I immediately said yes, and we scheduled two days with her 1st, 2nd, and 4th periods to “flip” between us.  I was super excited about being able to work with a smaller group of students, as it feels more personal, and I think students get more from that setting than they sometimes do with an especially large class.

A series of events over the last 24 hours helped me craft a better approach to our mini-lessons today.  I realized after school yesterday that we didn’t have enough computers available (our lab was already booked) for all sections to do some hands-on work after the mini-lesson.  I then arrived at work this morning and learned Ms. Cooney was very sick and that a substitute teacher had not been found.  Our fantastic department head, David White, and I discussed options, and we agreed to move forward with the small group plans as scheduled.  He and fellow English teachers stepped in to facilitate the seminar “speed dating” discussion style while the other half of the class came here for their instruction.

After wondering what to do without computers, I decided on the fly that kicking off the mini-lessons with a conversation was the best course of action.  I quickly drafted a graphic organizer for students; it gave them a place to jot down answers to these two questions as well as to take brief notes:

  1.  What topic(s) are you thinking about? (I made it clear it was OK if they had not picked one or had not had time to think about it just yet.)
  2.  What gives you the most difficulty when doing a research assignment?

We met in the small group area I had organized this morning, and students had a few minutes to jot down their responses.  We then did a whole-group conversation with each student sharing his/her responses to those two questions.  Not only did this give me a chance to get to know the students a little, but I think it also gave an element of humanity to the experience, especially since I had not seen most of these students until today.

Here are some of the challenges students identified; several of these bubbled up more frequently than others.

  • Getting started or knowing how/where to start
  • Staying on task/dealing with distractions
  • Procrastination
  • Finding valid and credible sources and knowing that they are such
  • Finding relevant resources (to the research topic)
  • Search terms
  • Managing citations (EasyBib to the rescue!)
  • Knowing which sources to use (MackinVIA groups FTW along with LibGuides)
  • Knowing how to use the databases
  • Keeping up with notes/organizing notes
  • Pacing self through the project

We took time to talk about each student’s challenges, as I wanted to be sure to validate and honor each area of concern.  This discussion was a perfect springboard to our research guide and to how its resources and today’s mini-lessons would help mitigate and address many of those concerns.  We also talked about how their responses would help me shape future conversations with teachers about research assignment design, especially pieces like more formative assessments to keep everyone on track and take the “pulse” of student progress (not in a punitive way), as well as more in-class time for hands-on work.  We also talked about possibilities for more collaboration as part of research projects, and perhaps birds-of-a-feather groups that meet periodically to share successes and challenges (this was super helpful for my Media 21 students a few years ago).

The feedback also gave me informal data that might help me sway teachers to build in more time for topic selection with activities like reading frenzies or Think/Extend/Challenge.  These activities encourage inquiry and give students concrete starting points for generating topic ideas or for encountering topics that might not be on their radar.

In hindsight, this activity seems like it should have been an obvious starting point; I honestly feel a bit sheepish that I didn’t initially plan to do this as part of the instructional time today, but I’m glad it came to me on the fly this morning.  Sometimes we get so busy that we forget the ultimate starting point is the student point of need, especially if we as librarians get caught up in trying to work within a very limited amount of scheduled time with students.  I am excited to listen to what the kids have to say when I see the next round of small groups tomorrow as we “flip” students and engage in “research chats”!

Southeastern Library Assessment Conference 2015 Session Roundup: Correlation Between Library Instruction and Student Retention

CC image from http://bit.ly/1HIX8AP

Note:  This is the second in a series of posts on sessions I attended at the Southeastern Library Assessment Conference on November 16.  Please see the first post with more background information here.

Session Description

Grand Valley State University Libraries has found a statistically significant positive correlation between librarian-led instruction and student retention. The libraries’ Head of Instructional Services and the university’s Institutional Analysis department worked together to connect library instruction with student-level data and then analyze that data over several academic years. This presentation will outline the data collection process, analysis methods, limitations, and outcomes of a longitudinal study.

Session Notes

The most interesting session we attended was “Correlation Between Library Instruction and Student Retention” from Mary O’Kelly, Librarian and Head of Instructional Services at Grand Valley State University.  Here are my notes and reflections from one of the best and most thoughtful conference sessions I have ever attended during my career as a librarian.

O’Kelly became the Head of Instructional Services in 2012 and very quickly began thinking about assessment planning.  Major questions she wanted to pursue included:

  • What questions do we have about our instructional program?
  • What data do we need to answer those questions?
  • How will we get it?
  • Who will analyze and interpret the results?

She also determined that she would need help gathering reliable data, as well as expert support with the data analysis.  She conducted literature reviews in these broad areas:

  1.  Relationships between library services and student retention
  2.  High-impact educational practices
  3.  Practices that impact student retention in a positive way

O’Kelly was particularly interested in retention as part of her research because:

  • Retention is a significant issue in higher education.
  • Retention is a priority at her institution and in higher education at large.
  • She found nothing in the literature about information literacy instruction and student retention (though there were studies on the correlation between non-instructional aspects of library service and retention).
  • She felt it was a top priority for the library to demonstrate its value and impact on larger institutional goals.
  • She wanted to see if the data would support library instruction and collaboration between faculty and librarians as a high impact practice.

Research Methods

O’Kelly and her staff used LibAnalytics from Springshare to collect library instruction data.  For each guide used with library instruction, staff entered:
A.  the course code, course number, and section in separate fields
B.  the professor name and librarian name
C.  other details: date, location, duration, and content

Attendance was not taken; absences were considered to be within a margin of error.

This research endeavor presented an opportunity to establish a new relationship with the university’s Office of Institutional Analysis.  O’Kelly worked closely with Rachael Passarelli, the office’s research analyst.  Together they began with 23 questions about the library’s instructional program.  Initially, they used some of these questions to adjust instruction and to think about important data points for instruction:

  • Which programs were not being reached?
  • How many students were reached at each grade level?
  • What percentage of instruction sessions were in the library?
  • What is the distribution of the number of sessions over the course of an academic year?

After she developed the working list of 23 questions and conducted her literature review, the question that bubbled to the top focused on the relationship between information literacy instruction and student retention.  This initial pass at the data led them to the big research question:

Of the students who saw a librarian in class, what percentage re-enrolled for the following fall compared to students who did not see a librarian?

The null hypothesis:  there is no relationship between library instruction and student retention.  Retention was defined as re-enrollment the following fall semester.

After the data was collected over the course of an academic year, the LibAnalytics dataset and questions were sent to Passarelli and the Office of Institutional Analysis.  Passarelli pulled student records from the course enrollments in the LibAnalytics data as part of her analysis.  Only courses with at least one library session were analyzed, and she ran a chi-square test of independence in SAS.  A fixed p-value of .05 was used to test significance, and a general linear model was used to control for ACT scores, high school GPA, socioeconomic status, and first-generation status.  The research project was IRB exempt since privileged student data was stripped before the analysis was sent back to the library.
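
For readers curious what this kind of test looks like in practice, here is a minimal sketch of a chi-square test of independence on a retention contingency table.  O’Kelly’s team ran their analysis in SAS; this sketch uses Python’s SciPy instead, and the enrollment counts are entirely hypothetical, invented only for illustration:

    # Chi-square test of independence, similar in spirit to the analysis
    # described above (the real study also used a general linear model to
    # control for ACT scores, GPA, socioeconomic status, and first-generation
    # status; that step is omitted here). All counts are hypothetical.
    from scipy.stats import chi2_contingency

    # Rows: students who saw a librarian in class vs. students who did not.
    # Columns: re-enrolled the following fall vs. did not re-enroll.
    observed = [
        [4200, 380],   # had library instruction
        [9100, 1240],  # no library instruction
    ]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4g}")

    # The study used a fixed significance level of .05: reject the null
    # hypothesis (no relationship between library instruction and retention)
    # if p falls below that threshold.
    if p < 0.05:
        print("Reject the null hypothesis.")
    else:
        print("Fail to reject the null hypothesis.")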

Results

The results over three years of data collection showed the following:

[Image: table of retention results from the three years of data collection]

As you can see, the findings were replicable over three years and statistically significant.  In addition, the magnitude increased each year.  Consequently, the null hypothesis was rejected.  The data also show the highest retention correlation among freshmen and graduate students.  To triangulate the data and rule out a faculty effect, a recent follow-up analysis compared the retention of students whose faculty had a librarian come into any of their classes with that of students whose faculty did not (the analysis was by faculty, not by student).  This follow-up analysis also showed a significant correlation (p-value = .0001).

Limitations

  • LibAnalytics record input is subject to human error
  • Attendance is estimated
  • Online instruction is excluded
  • Results cannot be generalized to other institutions
  • Retention is not persistence to graduation
  • Reasons why students withdraw are often complicated
  • Correlation is not causation (my note: I am deeply appreciative of this distinction, since so many library studies of recent years crow as though their results show causation when in fact they do not.)

Discussion/Next Steps

One of the limitations of library use studies is the student motivation factor.  In O’Kelly’s study, intrinsic motivation for library engagement is removed as a factor because whole-class data was used.  In addition, the large sample size is a strength of this study.  O’Kelly wants to further explore why students are using the library and to consider the role of course assignments (given by faculty) in library use.  At this time, the library instruction is very targeted because it is based on the school’s core curriculum, not the ACRL standards/framework.

Because faculty are including a librarian in assignment design and delivery, they are introducing the library as an academic support service to students.  In light of her research showing that faculty engagement with instruction librarians is correlated with student retention, and that student participation in library instruction is correlated with student retention, O’Kelly now wonders, “What’s the cause?”  She now wants to test this working hypothesis:

Faculty engagement with library instruction is a high impact practice that positively affects student retention.

O’Kelly will be publishing her research in early 2016; I will be sure to alert you when her formal study is published.  For now, you can see her slides here.  I was most impressed by the depth of thought and how she tried to cover every possible angle with her research methodology.  As I stated earlier, I also appreciate that she stresses her research shows correlation, not causation, a distinction I think is often lost in library studies in people’s efforts to advance advocacy agendas.  The other attendees were also clearly impressed with her research methodology, attention to detail, and the clear and deliberate way she communicated her work.  The session left me thinking about how her efforts might inspire my own research as a high school librarian and what data points matter to my local school and learning community.  I hope to write more about where this may lead in the spring of 2016.

Southeastern Library Assessment Conference 2015: Introduction and Space Assessment Session 1

My friend and former Norcross High colleague Jennifer Lund and I attended the Southeastern Library Assessment Conference, held November 16 at the historic Georgian Terrace Hotel in Atlanta, Georgia.  Though we were probably the only school librarians there, we felt welcome and gleaned many pearls of wisdom from the sessions we attended.  Sadly, I was only able to attend Day 1 (Monday, 11/16) due to district meetings I needed to attend on the second day (11/17), but I got MORE than my money’s worth from the sessions I did attend.  I highly recommend this conference if you are looking for smart, thoughtful perspectives grounded in evidence-based practice and data collection with integrity.  The conference was limited to 125 people and had a pleasant, intimate feel; in addition, we were served a gourmet lunch buffet (it was fabulous) and many delicious amenities throughout the day (Starbucks coffee, tea, water, sodas, cookies).  Many thanks to the conference organizers, who did a fantastic job with every aspect of the conference.  It is by far one of the best and most meaningful conference experiences I’ve had in my career; every session had substance.

This is the first in a series of posts on the sessions Jennifer and I attended on Monday, November 16, 2015.

Space Assessment: How They Use It, What They Want, Sara DeWaay, University of North Carolina, Charlotte

Session Description:  Getting student input on the library space can be a multi-layered effort. Come hear about the methods used to get an understanding of use patterns, as well as the students’ desires for a small branch library, as we work to transition towards a flexible space.

My Notes:
The emphasis was on users and feedback from students; Sara thought about the feedback in terms of “low cost/easy” vs. “high cost/hard” solutions and ideas from the students.  When she began the study, she thought of the library space in zones: group study, circulation area, lounge, quiet study, flexible, and creativity.  She began by doing a literature review on space assessment, focusing on both qualitative and quantitative assessment methods, and she also did research on user-centered design.  She looked at space assessment from a “before” and “after” perspective, since assessment should continue after the space remodel or redesign is initially completed.

She also formed a Student Advisory Group; positive aspects of this group included input, support, connection, and ownership for the students, but the challenges were maintaining momentum and a sustained sense of meaningfulness for the students after their participation ended.  In the future, Sara would try to make sure students received some sort of course credit for participation, perhaps as part of a project-based learning assignment related to space design.

She organized a student event where students could come and vote on designs; approximately 40-50 students participated.  She used large sheets of bulletin board or flip chart paper housed on easels, where students could vote with sticky notes.  For example:

[Image: flip chart with sticky-note votes from the student design event]

She also used flip charts with open-ended questions to gather feedback from students; she interspersed the flip charts with the buffet of food to “guide” students to this part of the feedback session.  Students also had a chance to mark up floor plans; she provided a variety of tools for this activity, including crayons, Sharpies, ballpoint pens, colored pencils, and regular pencils.  Students could then tape their proposed floor plans on the wall.  Afterwards, she coded the feedback from the student floor plans using categories like “atmosphere” (with specific elements assigned letters, roughly A-J) and “physical space” (with specific aspects numbered 1-14).  This method of floor plan coding allowed her to look at the data in a “layered” way (example: 2B, a physical space code combined with an atmosphere code).
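
If you wanted to tally this kind of layered coding programmatically, a minimal sketch follows.  The specific codes and feedback items below are invented for illustration; the session only described the general scheme (letters for atmosphere, numbers for physical space):

    # A sketch of "layered" feedback coding: each piece of feedback gets a
    # physical-space code (a number) and an atmosphere code (a letter), and
    # the combined code (e.g., "2B") is tallied. All data here is hypothetical.
    from collections import Counter

    feedback_codes = [
        (2, "B"),  # e.g., quiet study zone + a comment about lighting
        (2, "B"),
        (7, "A"),  # e.g., lounge + a comment about noise
    ]

    layered = Counter(f"{space}{atmosphere}" for space, atmosphere in feedback_codes)
    for code, count in layered.most_common():
        print(code, count)  # most frequent layered codes first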

Another strategy was student surveys.  Unfortunately, her sample size of 40 was not ideal, but she was nonetheless able to ask more detailed questions about services, as well as questions about the library in comparison to other spaces in the building.  She also had library student assistants help track space use; using iPads and Suma, they were able to gather data and plug it into LibAnalytics to get a better idea of space usage.

Once she looked at all the data, she was able to better understand student needs and could classify possible changes and redesign elements into these categories:

  • Low cost/easy to do
  • Low cost/difficult to do
  • High cost/easy to do
  • High cost/difficult to do

Unfortunately, the budget for the renovation was put on hold, but if it moves forward, Sara would get faculty input and do similar activities with staff.  The major takeaway for me from this session was the idea of space assessment as cyclical: it should be ongoing, and it is important to continue even after you complete a renovation or redesign project, to make sure the new space keeps working for students and to see where adjustments may be needed.  This idea was especially helpful for Jennifer and me, since she has opened a new library space and I’m in the middle of working on a redesign project for the library here at Chattahoochee High.

My next post will be about the second session we attended on battling survey fatigue.