Sticky Notes as Formative Assessment for Information Literacy Instruction: Coding Student Responses

Yesterday I blogged about our pre-searching activities and the use of sticky notes for some gentle formative assessment.  Today I want to share how I went about coding the student responses, not only to get a sense of students’ thinking during the two days of pre-searching, but also to use the data as a baseline of sorts as we look at a broad collection of their work and try to track their trajectory of growth and progress through this extended research unit.

Coding Information Sources

I began by removing the sticky notes for each period from the whiteboards, affixing them to large Post-it notes, and labeling each grouping by period and response type.  The next challenge was to think of categories for coding the student responses.  “Information sources used” was the easiest starting point, so I began there.

Coding "Information Sources Used" Sticky Notes from Days 1 and 2 of PreSearch, 3rd Period #rustyq

I listed all the information sources from the LibGuide for the project and then tallied responses.  I wound up adding Google as another category since some students indicated they had used this search engine.  Here are the results by period:

2nd period Rust Sources Used Sticky Note Data PreSearch October 2014

3rd period Rust Sources Used Sticky Note Data PreSearch October 2014

In both classes, it appears Gale Opposing Viewpoints was a starting point for the majority of students; Gale Science in Context was next in popularity.  2nd period seemed to like SweetSearch and self-selected information sources while 3rd period leaned more heavily toward Academic Search Complete.

When we look at the updated topics roster (while taking into account the initial list of topics they had generated), the numbers are not too surprising.  I know that many students will benefit from some guidance toward specific databases and search tools that align with their topic choices as we move deeper into the project, but I’m not terribly surprised by what I see from the first two days of risk-free pre-search time to hone in on an interest area within one broad topic.  This data does suggest to me, though, that there may be sources that are unfamiliar to students or that they have used only minimally in the past (as do the results from the information literacy skills needs survey we did via index cards with Ms. Rust a few weeks ago).

Questions

My categories for coding the questions students generated included:

  • Who
  • What
  • Where
  • When
  • How or Why
  • Topic Clarification
  • Question about the research or the assignment
  • Other (other types of questions, e.g., “Is Finland’s educational system superior to the United States?”)
  • None

2nd period posed 15 “how/why” questions and 11 questions that fell under “other”; there were four “who” questions and six “what” questions; three students did not note any questions.  3rd period generated questions that primarily fell under “what” (four), “how/why” (four), research/assignment questions (six), or “other” (six); five students did not generate any questions.  Clearly, there is a stark contrast between the two classes in the types of questions they generated.  This data may indicate that 3rd period needs more guided help in engaging deeply with their articles OR strategies for generating questions.
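For anyone who wants to automate tallies like these rather than count by hand, a few lines of Python will do the bookkeeping.  This is only an illustrative sketch — the category labels and counts below simply restate the tallies above (our actual coding was done by hand on the Post-it groupings, not with a script):

```python
from collections import Counter

# One coded label per sticky note; labels mirror the question categories above.
# Counts restate the hand tallies for 2nd and 3rd period.
second_period = (
    ["how/why"] * 15 + ["other"] * 11 + ["who"] * 4
    + ["what"] * 6 + ["none"] * 3
)
third_period = (
    ["what"] * 4 + ["how/why"] * 4 + ["research/assignment"] * 6
    + ["other"] * 6 + ["none"] * 5
)

def tally(responses):
    """Return (category, count) pairs sorted from most to least frequent."""
    return Counter(responses).most_common()

for period, responses in [("2nd", second_period), ("3rd", third_period)]:
    print(period, tally(responses))
```

A sorted tally like this makes the contrast between the two classes easy to see at a glance, and the same pattern works for the sources-used and discoveries categories.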

Discoveries and Insights

For this group of sticky note responses, I created these coding categories:

  • Fact or concrete detail
  • Concept/Conceptual
  • Question
  • Reflection
  • Commentary/Opinion/Reaction

Once I began taking a pass through the student responses, I realized I needed four additional categories:

  • Topic Ideas
  • Sources
  • None
  • Other

2nd period students primarily recorded facts or concrete details for their notes; however, several used this space to think through additional topic ideas.  The pattern was nearly identical in 3rd period.  I was not surprised by these findings: students spent only two days doing light pre-search, and I knew in advance that gathering enough information to eliminate some topic areas of interest would be where many would expend their time and energy.

Final Thoughts

The pre-search activity and days were designed to help students rule out some topics and have time to explore those of interest.  Our sticky note method of formative assessment was one we felt would give us feedback without imposing a time-consuming structure on students, since we really wanted them to channel their energies into reading and learning more about their topic lists.  While some of the data I coded was not surprising, I was really struck by the differences in the types of questions the two classes generated.  Right now I don’t know if this means one class might need more help generating questions from informational texts, or if perhaps they approached the reading and activity in a different way that didn’t lend itself to composing lots of questions at that early juncture.

If you are incorporating pre-search as part of the connecting cycle of inquiry, what kinds of formative assessments do you use?  If you code student responses, how do you approach that process, and how do you use that data to inform your instructional design?   I find this kind of work interesting—I am looking forward to seeing if any of these gray areas or perceived gaps come to light as we move further into our research unit this month.

Connected Learning and Implications for Libraries as Spaces and Mentors for Learning

“Connected learning is realized when a young person is able to pursue a personal interest or passion with the support of friends and caring adults, and is in turn able to link this learning and interest to academic achievement, career success, or civic engagement.”
from Connected Learning:  An Agenda for Research and Design

For the last month or so, I’ve been dwelling in Connected Learning:  An Agenda for Research and Design, a research synthesis report that outlines the research and findings of the Connected Learning Research Network, a group chaired by Dr. Mimi Ito.  In addition to the report, I’ve enjoyed the series of recent webinars centered on the report.

Supplementary readings have also informed my understanding of this report.

Additional definitions and explanations can be found here; the infographic embedded here is also a helpful visualization.

In “Connected Learning:  An Agenda for Social Change”, Dr. Ito asserts that connected learning:

“…is not about any particular platform, technology, or teaching technique, like blended learning or the flipped classroom or Khan Academy or massive open online courses. It’s agnostic about the method and content area. Instead, it’s about asking what is the optimal experience for each learner and for a high-functioning learning community?”

In the Connected Learning:  An Agenda for Research and Design report, the authors describe connected learning as a design model:

“Our approach draws on sociocultural learning theory in valuing learning that is embedded within meaningful practices and supportive relationships, and that recognizes diverse pathways and forms of knowledge and expertise. Our design model builds on this approach by focusing on supports and mechanisms for building environments that connect learning across the spheres of interests, peer culture, and academic life. We propose a set of design features that help build shared purpose, opportunities for production, and openly networked resources and infrastructure” (5).

I’ve recreated the visualization embedded in the report to provide another way of looking at connected learning and thinking about how this model seeks to “knit” together the peer-supported, interest-powered, and academically oriented contexts for learning (12).


I’m still coding and organizing my notes from the report as I try to pull out the big takeaways for me, but as I review these notes and the ones I took from the webinar on assessing connected learning outcomes last week, I’m thinking about this first wave of big ideas and questions:

  • How do libraries develop learning agendas that are aligned with agendas for social change in their community?  How do the two inform each other?
  • How can libraries embrace this approach to designing learning environments to help us move from “nice to necessary,” a question that was posed at ALA Midwinter in 2013 and that I’m attempting to flesh out in my work here as a Learning Strategist at Cleveland Public Library (and that I hope to share with you later this year)?
  • How do we create learning environments and experiences as well as relationships with those we serve to move beyond the initial “sweet spot” of attachment to building a deeper level of engagement?  How do we as librarians (with the help of our community) design learning environments that provide diverse entry points and access for people to form communities of learning where they can create more nuanced narratives of learning as they create, share, and connect with others?  How do we design learning spaces and experiences that create more “pathways to opportunity” and participation?
  • How might libraries of all kinds serve as an “open network” that is both a medium and a mentor, helping people connect and move more meaningfully across multiple learning spaces and spheres within their local community as well as a larger, more global community of learners?  Kris Gutierrez’s metaphor of “learning as movement” across many kinds of contexts has spurred this thinking.
  • Kris Gutierrez and Bill Penuel discussed concepts of horizontal learning and boundary crossing in their webinar and explored the question of how we help people carry the practices, dispositions, and expertise honed in one learning space into another in order to go deeper with that learning and expand the possibilities for action and participation.  How do libraries support communities of learning in engaging in this boundary crossing and horizontal learning to build greater personal as well as civic capacity?
  • Both Gutierrez and Penuel emphasized the need to further contemplate and explore individual and collective assessment of these practices.  In the words of Dr. Gutierrez, “What tools, dispositions, practices, forms of expertises TRAVEL and how do we know it when we see it?”  I’m also thinking about how we frame formative and summative assessments as touchpoints for learning.
  • How can librarians help people take deep “vertical knowledge” in a specific content area and apply it across multiple learning contexts and spaces?  This question relates to horizontal learning and boundary crossing.  I like to think of these concepts as cross-pollination of ideas and learning.
  • How do we more effectively build vocabulary for this kind of learning in our learning communities?
  • How do we more effectively thread and address issues of equity across our instructional design and assessment processes?
  • How do libraries cultivate deeper and more meaningful partnerships and connections with other institutions of learning in their communities for more strategic impact?
  • How do we as librarians facilitate the creation of sustained networks to help people make connections between social, academic, and interest-driven learning? (see pp. 46–47 in the report for more on this question)

As you can see, these learning and design principles as well as the findings and concerns shared in the report have saturated my thinking.  As I reread the report and make additional passes through my notes, I will continue to take an inquiry stance to further unpack the concepts and language embedded in this work.  I’ll also revisit the case studies included in the report to further develop ideas on what this work could look like in practice in different library settings.  In addition, I will carve out more time to listen as well as contribute to conversations about connected learning in the NWP study group as well as the Connected Learning Google Plus group.

Teachers as Learners Conference Keynote and Concurrent Session Presentations

I want to thank the wonderful educators of the Griffin-Spalding County School System in Griffin, Georgia for inviting me to present a keynote speech and four concurrent sessions this past week at their Teachers as Learners Conference.  Below are resources from two days of learning and sharing!

Peer Review of Digital Research Projects, Spring 2011

We are in the final week of our digital research project, which the Media 21 students have been engaged in for about six weeks as they investigate issues facing our veterans who have served or who are currently serving in Iraq and Afghanistan.  We began the first of four days of peer review of the projects yesterday; each day, each collaborative research/inquiry group is assigned a fellow group to evaluate using the form embedded below.  Susan Lester, my co-teacher, and I are looking forward to reading student feedback and then debriefing the results of that feedback with each group (we will keep the individual feedback forms private so that students feel free to provide honest and constructive assessment/evaluation).

Are you incorporating peer review into any aspect of the research projects you facilitate?  If so, how do you integrate peer review as an assessment and reflective learning experience?

Creating Conversations for Learning: NoodleBib Assignment Dropbox as Formative Assessment, Part 2

A few weeks ago, I composed a post about the possibilities of using the NoodleBib shared assignment dropbox feature as a formative assessment for evaluating working bibliographies and notecards.  Now that I’ve completed two research project “checkpoints” using the shared assignment dropbox, I am happy to share that this formative assessment has been successful in:

1.  Gaining insight into students’ selection of information sources and helping them identify gaps in sources they may be overlooking that could inform their research.

2.  Helping students identify and understand the mistakes they’ve made in the citation process and working with them to correct the entries.

3.  Seeing what students are doing really well with their notetaking and providing positive feedback, while also identifying areas of weakness and then engaging each student in a conversation for learning by sharing strategies for tackling those “challenge” areas.

My roles in facilitating these formative assessments included:

1.  Setting up the shared assignment dropboxes.

2.  Teaching students how to share an assignment and confirming I had received the assignments from each group.

3.  Taking the time to evaluate each group’s bibliographic entries and notecards while providing feedback.

4.  Keeping a spreadsheet of general notes for each group’s work and noting patterns in what students were doing well and common problems I saw in student work.

5.  Sharing my findings and notes with my co-teacher, Susan Lester, and then the two of us working together with groups to address challenges I identified through the formative assessment; in addition, I enlisted the assistance of students who were demonstrating specific skills in an exemplary manner to help peers on an “as needed” basis.

I love how easy it is to evaluate individual bibliographic entries and the accompanying notecards for each source cited on one screen.  All you have to do is log into your account, scroll down to the bottom of your project lists page, and then open a student project (which, for these assignments, were collaboratively created lists for group research projects).  You can then click on “Bibliography” to access the bibliographic entries and accompanying notecards on one screen; from there you can enter custom comments for each entry, and for the notecards, you can compose custom comments or use a comment from the pre-existing database of notecard comments.  You can see when each entry and notecard were created as well as the time and date of any revisions a student may have made.  Take a look at how easy it is to work with the interface in the screenshots below (please note student names have been removed to protect their privacy).

Figure 1: Bibliographic Entry Comments

Figure 2: Comments on Electronic Notecards

I absolutely love using the shared assignment dropbox for formative assessment of student work and using the feedback with students to initiate or sustain conversations for learning.  Here are a few features I’d suggest to make the shared assignment dropbox in NoodleBib even better:

  • add the ability to message a group or comment on the overall project
  • add the ability for teachers and librarians to create their own banks of custom comments for both notecards and bibliographic entries
  • add a spellchecker on the teacher/librarian side to spellcheck comments
  • add the ability for the librarian or teacher to “like” a student bibliographic entry or notecard (à la Facebook)
  • add the ability to create threaded discussions, in the style of Facebook or the new Google Docs discussions, so that students and the teacher and/or librarian can engage in a virtual discussion about the feedback provided (think ramped-up commenting!)

If you haven’t tried the shared assignment dropbox in NoodleBib, I encourage you to give it a try as a way to embed yourself in the classroom with teachers and students as part of your collaborative partnerships and to participate meaningfully in assessment of student work.  If you have tried the shared assignment dropbox as formative assessment, what features did you like, or what enhancements would you like to see added for 2011–12?