Assessment

Holistic and Individualized Formative Assessment of Research and Inquiry Processes

For the last two weeks, our students have been immersed in investigating information and constructing new understandings as they compose their research design proposals, revise sections of their proposals, and do additional research after focusing and narrowing their topics and research questions.  As they have gone back and forth refining their topics and questions and doing the subsequent research, we’ve seen our students move between confusion/doubt/uncertainty/discomfort and clarity.  Most students are not used to doing this sort of deep dive with a topic, making their own choices about the topic and research questions, articulating how they are demonstrating growth in their learning, or selecting their learning products; consequently, the messiness of choice and ownership of their projects has been a new experience (and uncomfortable to varying extents) for them.  Sarah Rust and I have experienced a spectrum of emotions in this inquiry process as well; we know our students will grow from these experiences, yet we too feel some of that same uncertainty and frustration when they wobble or stall in spite of our efforts to scaffold and support them with individualized feedback, resources, and reflective questioning.  Like our colleagues Heather, Meg, Marci, and Cathy, we provide them strategies and feedback that will propel them forward and give them the tools to self-help, but as we have told them, we cannot make the decisions for them or give them the answers.  We stay calm and reiterate that we are focused on how and what they are learning, not grades—of course, this discourse is a departure from the narrative they have heard their entire school lives in our test-driven culture.

The individualized and fluid nature of working with 50+ students who are all pursuing different topics is also a newer experience for us and definitely for the students.  Over the last two weeks, any given day has been a potpourri of joy, exasperation, delight, and doubt as students have drafted their research design proposals for their multigenre projects.  This kind of work is where the collaborative partnership Sarah and I have is critical: having an instructional partner lets us be responsive to these kinds of learning experiences and to individual student needs.  Because we both bring different strengths to the table and can process what we are observing in student work together, we are much better positioned to truly help our students than if we were doing this in a solitary or prescriptive, rigid way.

After receiving the drafts of their project proposals, Sarah and I have employed a variety of strategies to personalize the feedback for each student at their points of need.  Here are some of the action steps we’ve tried:

Individualizing and Capturing Feedback Through Mail Merge and Databases

I created a database in Word of all of our students in 2nd and 3rd periods.  The data fields I created included:

  • First name
  • Last name
  • Class Period
  • Comments About the Narrowed/Focused Topic
  • Multigenre Products Students Selected
  • Publishing Platform of Virtual or Paper (Word/PDF)
  • General Notes (comments about student self-selected learning targets, what they know about their topic at this point, what they want to learn, research questions, their working bibliographies, and search terms/strategies)
  • Next Steps–specific tasks and suggestions to help the students move forward.  These action steps could also include requests for students to schedule 1:1 help or to participate in some of the small-group help sessions we set up in response to the patterns of thinking and gaps we saw in the proposals.

[Images: rust-feedback1, rust-feedback2]

I went through each proposal and typed my feedback for each student into the appropriate fields of the database document.  I then used the Mail Merge wizard in Word to create a “form letter” that imported this feedback, and I printed the feedback documents for each student on colored paper or in color.  Once I printed the completed feedback forms, I stapled them to the research design proposal drafts and returned them to the students as soon as possible so that they could move forward or make revisions.  I also provided a copy to Sarah so that she could begin developing a list of needs to address and prioritize which students needed her help and areas of expertise.  The master database provides us with an archived record of the formative assessment to use as we look at student growth; it is also easily accessible should a student lose his/her copy of the feedback form and need a reprint.
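
For anyone who prefers to script this kind of merge instead of using Word’s Mail Merge wizard, here is a minimal sketch of the same idea in Python.  It assumes the database has been exported to a hypothetical CSV file (proposal_feedback.csv) whose column headers mirror the fields listed above; the file names, headers, and template wording are illustrative stand-ins, not the actual forms we used.

```python
# Minimal sketch of a mail-merge-style feedback generator.
# Assumes a CSV export of the feedback database with these (hypothetical)
# column headers: first_name, last_name, period, topic_comments,
# products, platform, general_notes, next_steps.
import csv
from pathlib import Path

TEMPLATE = """Research Design Proposal Feedback
Student: {first_name} {last_name}    Period: {period}

Comments on narrowed/focused topic:
{topic_comments}

Multigenre products selected: {products}
Publishing platform: {platform}

General notes:
{general_notes}

Next steps:
{next_steps}
"""

def merge_feedback(csv_path: str, out_dir: str = "feedback_letters") -> None:
    """Write one plain-text feedback letter per student (one per CSV row)."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            letter = TEMPLATE.format(**row)
            name = f"{row['period']}_{row['last_name']}_{row['first_name']}.txt"
            (out / name).write_text(letter, encoding="utf-8")

if __name__ == "__main__":
    merge_feedback("proposal_feedback.csv")
```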

It did take quite a bit of time to methodically go through each proposal and generate the personalized feedback.  However, I so appreciate the opportunity to engage in this sort of assessment because it helps me get to know the students as learners.  This work also improves my instruction because I can easily see patterns of understanding and confusion; it helps me be a more reflective and effective practitioner as well as instructional designer.

Conferencing/Coaching/Triage 1:1 and Small Group

Using this information as our starting point, Sarah and I have been meeting with students the last few days (late last week and all of this week) to discuss the feedback we’ve provided them; we use the feedback forms as a strategic entry point for face-to-face conference/coaching conversations with students.  We have been organizing our 1:1 meetings and small group sessions through a variety of methods each day:

  • Students can sign up for specific individual help each day—we have used large post-it paper and our Verb dry erase boards as our parking lots for students to indicate they need assistance or have questions.
  • Students can sign up for small group help or indicate they want to join a future small group work session through our Verb dry erase boards.  For example, after reviewing all the research design proposals, I realized I needed to do some small group instruction on additional search techniques with Boolean operators and additional instruction on mining Academic Search Complete.
  • For those who might be shy or reluctant to place themselves in one of these help-request parking lots, we also work through our class rosters and check in with each student so that we meet with EVERY student and “check up” on their progress, successes, questions, and worries.

Yesterday, Sarah called students up by the class roster while I started with my list of students who had requested help.  Today we approached the scheduling of the 1:1 conferences by working through the class rosters and having students first check in with Sarah about some of their recent process work; students then moved to my table to discuss the feedback they received from us on their research design proposals.  We each set up a help area with our mobile tables and our green Hon rolling chairs so that we had comfortable spaces to talk with students, where they could spread out their work and where we could show them specific resources or skills on our laptops if they needed some concrete visualization or examples.  Some conferences are brief while others are more extended; typically, each meeting lasts 3-10 minutes—it all depends on student need and how the conversation evolves in the conference.  We also keep notepads, large lined sticky notes, and/or Google Docs available at the conference table to jot down notes from each meeting, while students bring along their folders of process work, drafts of their design proposals, and the individualized design proposal feedback form.

[Image: student conference notes]

In just these first few days of meeting with students, it is very apparent where students feel confident (and which skills/processes/ideas they’re self-assured about) and where they feel fuzzy, unsure, and/or anxious.  We’ve also observed that most of our students are not used to this level of accountability, and some seem a bit uncomfortable when we ask questions to nudge them to dig deeper or be more specific with details; we sense many are also not used to these types of conferences that put the responsibility and decision making on them as students.  We are framing these conference/coaching sessions as discussions to help them think through their choices, clarify their own thinking and next steps, and move forward with their projects, since we don’t want them to see the messiness and muckiness of inquiry as punitive.  These sessions have also helped us identify those who might benefit from some of our upcoming small group mini-lessons but who may not have initially signed up for assistance.  Last but not least, I believe these conferences convey to our students that each person matters and that we care about them and their topics.

Reflections

While we cannot do their work for them, we can give students every opportunity to get personal assistance in a low-key setting—we want them to know they cannot fall through the cracks or simply fly under our radar.  While I’ve done this sort of work before, this is probably the biggest chunk of time I’ve had in a collaborative partnership for this level of assessment and 1:1 student conferencing.  This approach requires us to be agile and responsive because each day is different and every student need varies.  This kind of conferencing/coaching is time consuming and messy; while the prep for the small group work is pretty straightforward, the 1:1 help is definitely open-ended.  I have been inspired by watching Sarah (who is a master at this process), and my own conferencing skills with students are improving as a result; I have also been inspired by my friend and fellow school librarian Heather Hersey.  Her post about the importance of conferencing helped me think about focusing on all aspects of their inquiry work and design proposals rather than just sources or bibliographies; it also inspired my idea of using the mail merge form and database to capture feedback and use that as a starting point for the student conferences/coaching sessions.

Sarah and I have also been discussing how intense this kind of work is and how you have to be comfortable making adjustments to timelines and plans as needed in order to be responsive to the students.  The processes are messy, yet this “mucking around in ideas” is the grist for the growth and critical thinking that happens as both we and our students problem solve, question, and revise our ideas and stances.  Neither of us has any idea how someone would do this kind of process-driven, organic, fluid, and reflective work alone!  We love that our combined talents help the students as well as each other; we also appreciate having someone else each day who can help us see things we might have missed or think about a particular situation or challenge with fresh eyes.  We are also excited that we can model collaborative learning for our students—how often do they get to be in a learning environment where there are at least 2-3 adults who can help them and provide the kind of specific and personalized attention they are receiving?  Most importantly, this type of collaboration is a catalyst for inquiry work and for integrating more formative kinds of assessments that benefit students and impact learning.

We expect the 1:1 conferences, coaching, and small group instruction to continue for the next 7 days of school leading up to our Thanksgiving break.  I hope to share more images, video, and written/video reflections and feedback from both of us as well as our students in an upcoming post later this month.  I’m also thinking about how to better integrate the conferencing/coaching/conversation aspect into the inquiry approach (and at an earlier point in time) with research, using Cris Tovani’s conceptualization of these conversations as data and formative assessment (see her text, So What Do They Really Know?  Assessment That Informs Teaching and Learning).

[Image: Cris Tovani’s conversation calendars]

How are you approaching assessment with inquiry work?  How do you negotiate and embrace the challenges of time and fluidity with this approach to learning and research?  How do you scale this kind of learning experience when there are always challenges of time, space, and staffing?

Simple Yet Powerful Formative Assessment of IR with Sarah Rust

[Image: IR Sticky 3]

Every Wednesday is Independent Reading (IR) day in our Language Arts classes here at NHS.  Today, Language Arts teacher Sarah Rust, one of our awesome collaborative partners, did this very simple yet interesting formative assessment with her students.  The instructions:

[Image: IR post-it instructions from Ms. Rust]

Students selected a sticky note in a color of their choosing and then composed their responses.  As an extra touch to celebrate the concept of IR, Ms. Rust then took their responses and fashioned them into the letters “IR.”  While this idea seems simple on the surface, the student responses were revealing and showed a wide range of book selections as well as reactions to the IR experience.  These responses can be a springboard to future IR learning activities and experiences for book selection and peer sharing.

[Images: IR Sticky 2, IR Sticky 4]

It’s another reason why sticky notes are my favorite “technology” as of late!  This approach is a great way to do a quick individual assessment of student learning or of where students are with their current IR, as well as to make an artistic class statement that represents every student’s voice.

Sticky Notes as Formative Assessment for Information Literacy Instruction: Coding Student Responses

Yesterday I blogged about our pre-searching activities and the use of sticky notes for some gentle formative assessment.  Today I want to share how I went about coding the student responses, not only to get a sense of students’ thinking during the two days of pre-searching, but also to use the data as a baseline of sorts as we look at a broader collection of their work and try to track their trajectory of growth and progress through this extended research unit.

Coding Information Sources

I began by removing the sticky notes for each period from the whiteboards, affixing them to large post-it notes, and labeling each grouping by period and response type.  The next challenge was to think of categories for coding the student responses.  “Information sources used” was the easiest starting point, so I began there.

[Image: coding “Information Sources Used” sticky notes from Days 1 and 2 of pre-search, 3rd period #rustyq]

I listed all the information sources from the LibGuide for the project and then tallied responses.  I wound up adding Google as another category since some students indicated they had used this search engine.  Here are the results by period:

[Images: 2nd and 3rd period “sources used” sticky note data from pre-search, October 2014]

In both classes, it appears Gale Opposing Viewpoints was a starting point for the majority of students; Gale Science in Context was next in popularity.  2nd period seemed to like SweetSearch and self-selected information sources while 3rd period leaned more heavily toward Academic Search Complete.

When we look at the updated topics roster (while taking into account the initial list of topics they had generated), the numbers are not too surprising.  I know that many of them will benefit from some guidance toward specific databases and search tools that align with their topic choices as we move deeper into the project, but I’m not terribly surprised by what I see from the first two days of risk-free pre-search time spent simply honing in on an interest area for one broad topic.  This data does suggest to me, though, that there may be sources that are unfamiliar to students or that they have used only minimally in the past (as do the results from the information literacy skills needs survey we did via index cards with Ms. Rust a few weeks ago).
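
For anyone curious how this tallying might be automated once the sticky notes are transcribed, here is a minimal sketch in Python using collections.Counter.  The responses and codes below are hypothetical placeholders, not our actual student data.

```python
# Minimal sketch of tallying coded sticky-note responses by class period.
# The entries below are hypothetical placeholders, not actual student data.
from collections import Counter

# Each tuple pairs a class period with the information source coded
# from one transcribed sticky note.
coded_responses = [
    ("2nd", "Gale Opposing Viewpoints"),
    ("2nd", "SweetSearch"),
    ("2nd", "Gale Opposing Viewpoints"),
    ("3rd", "Academic Search Complete"),
    ("3rd", "Gale Science in Context"),
    ("3rd", "Google"),
]

tallies = {}  # period -> Counter of sources used
for period, source in coded_responses:
    tallies.setdefault(period, Counter())[source] += 1

for period in sorted(tallies):
    print(f"{period} period:")
    for source, count in tallies[period].most_common():
        print(f"  {source}: {count}")
```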

Questions

My categories for coding the questions students generated included:

  • Who
  • What
  • Where
  • When
  • How or Why
  • Topic Clarification
  • Question about the research or the assignment
  • Other (other types of questions, e.g., “Is Finland’s educational system superior to the United States?”)
  • None

2nd period posed 15 “how/why” questions and 11 questions that fell under “other”; there were 4 “who” questions and 6 “what” questions; 3 students did not note any questions.  3rd period generated questions that primarily fell under “what” (4), “how/why” (4), research/assignment questions (6), or “other” (6); 5 students did not generate any questions.  Clearly, there is a stark contrast between the two classes in the types of questions they generated.  This data may indicate that 3rd period needs more guided help in engaging more deeply with their articles OR strategies for generating questions.
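
To make the contrast between the two classes easier to see at a glance, the reported tallies can be laid side by side; the counts below come straight from the paragraph above, and the quick comparison is just an illustrative sketch.

```python
# Question-type tallies as reported above; the comparison is illustrative.
question_tallies = {
    "2nd": {"how/why": 15, "other": 11, "what": 6, "who": 4, "none": 3},
    "3rd": {"how/why": 4, "what": 4, "research/assignment": 6, "other": 6, "none": 5},
}

for period, counts in question_tallies.items():
    # "none" counts students who posed no questions, so it is excluded
    # from the question total.
    questions = sum(n for code, n in counts.items() if code != "none")
    how_why = counts.get("how/why", 0)
    print(f"{period} period: {how_why} of {questions} questions coded how/why")
```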

Discoveries and Insights

For this group of sticky note responses, I created these coding categories:

  • Fact or concrete detail
  • Concept/Conceptual
  • Question
  • Reflection
  • Commentary/Opinion/Reaction

Once I began taking a pass through the student responses, I realized I needed four additional categories:

  • Topic Ideas
  • Sources
  • None
  • Other

2nd period students primarily recorded facts or concrete details for their notes; however, several used this space to think through additional topic ideas.  The pattern was nearly identical in 3rd period.  I was not surprised by these findings since students spent only two days doing light pre-search, and I knew in advance that getting enough information to eliminate some topic areas of interest would be where many would expend their time and energy.

Final Thoughts

The pre-search activity and days were designed to help students rule out some topics and have time to explore those of interest, and our sticky note method of formative assessment was one we felt would give us feedback without imposing a structure that would be time-consuming for students, since we really wanted them to channel their energies into reading and learning more about their topic lists.  While some of the data I coded was not surprising, I was really struck by the differences in the types of questions the two classes generated.  Right now I don’t know if this means one class might need more help generating questions from informational texts or if perhaps they were approaching the reading and activity in a different way that didn’t lend itself to composing lots of questions at that early juncture.

If you are incorporating pre-search as part of the connecting cycle of inquiry, what kinds of formative assessments do you use?  If you code student responses, how do you approach that process, and how do you use that data to inform your instructional design?   I find this kind of work interesting—I am looking forward to seeing if any of these gray areas or perceived gaps come to light as we move further into our research unit this month.

Connected Learning and Implications for Libraries as Spaces and Mentors for Learning

“Connected learning is realized when a young person is able to pursue a personal interest or passion with the support of friends and caring adults, and is in turn able to link this learning and interest to academic achievement, career success, or civic engagement.”
from Connected Learning:  An Agenda for Research and Design

For the last month or so, I’ve been dwelling in Connected Learning:  An Agenda for Research and Design, a research synthesis report that outlines the research and findings of the Connected Learning Research Network, a group chaired by Dr. Mimi Ito.  In addition, I’ve enjoyed the series of recent webinars centered around the report.

Supplementary readings have also informed my understanding of the report.

Additional definitions and explanations can be found here; the infographic embedded here is also a helpful visualization.

In “Connected Learning:  An Agenda for Social Change”, Dr. Ito asserts that connected learning:

“…is not about any particular platform, technology, or teaching technique, like blended learning or the flipped classroom or Khan Academy or massive open online courses. It’s agnostic about the method and content area. Instead, it’s about asking what is the optimal experience for each learner and for a high-functioning learning community?”

In the Connected Learning:  An Agenda for Research and Design report, the authors describe connected learning as a design model:

“Our approach draws on sociocultural learning theory in valuing learning that is embedded within meaningful practices and supportive relationships, and that recognizes diverse pathways and forms of knowledge and expertise. Our design model builds on this approach by focusing on supports and mechanisms for building environments that connect learning across the spheres of interests, peer culture, and academic life. We propose a set of design features that help build shared purpose, opportunities for production, and openly networked resources and infrastructure” (5).

I’ve recreated the visualization embedded in the report to provide another way of looking at connected learning and thinking about how this model seeks to “knit” together the contexts of peer-supported, interest-powered, and academically oriented learning (12):

[Image: recreated connected learning visualization]

I’m still coding and organizing my notes from the report as I try to pull out the big takeaways for me, but as I review these notes and the ones I took from the webinar on assessing connected learning outcomes last week, I’m thinking about this first wave of big ideas and questions:

  • How do libraries develop learning agendas that are aligned with agendas for social change in their community?  How do the two inform each other?
  • How can libraries embrace this approach to designing learning environments to help us move from “nice to necessary” (a question that was posed at ALA Midwinter in 2013 and that I’m attempting to flesh out in my work here as a Learning Strategist at Cleveland Public Library, work I hope to share with you later this year)?
  • How do we create learning environments and experiences as well as relationships with those we serve to move beyond the initial “sweet spot” of attachment to building a deeper level of engagement?  How do we as librarians (with the help of our community) design learning environments that provide diverse entry points and access for people to form communities of learning where they can create more nuanced narratives of learning as they create, share, and connect with others?  How do we design learning spaces and experiences that create more “pathways to opportunity” and participation?
  • How might libraries of all kinds serve as an “open network” that is both a medium and a mentor, helping people connect and move more meaningfully across multiple learning spaces and spheres within their local community as well as a larger, more global community of learners?  Kris Gutierrez’s metaphor of “learning as movement” across many kinds of contexts has spurred this thinking.
  • Kris Gutierrez and Bill Penuel discussed concepts of horizontal learning and boundary crossing in their webinar and explored the question of how we help people carry the practices, dispositions, and expertise honed in one learning space into another in order to go deeper with that learning and expand the possibilities for action and participation.  How do libraries support communities of learning in this boundary crossing and in horizontal learning to build greater personal as well as civic capacity?
  • Both Gutierrez and Penuel emphasized the need to further contemplate and explore individual and collective assessment of these practices.  In the words of Dr. Gutierrez, “What tools, dispositions, practices, forms of expertises TRAVEL and how do we know it when we see it?”  I’m also thinking about how we frame formative and summative assessments as touchpoints for learning.
  • How can librarians help people take deep “vertical knowledge” in a specific content area and apply it across multiple learning contexts and spaces?  This question relates to horizontal learning and boundary crossing.  I like to think of these concepts as cross-pollination of ideas and learning.
  • How do we more effectively build vocabulary for this kind of learning in our learning communities?
  • How do we more effectively thread and address issues of equity across our instructional design and assessment processes?
  • How do libraries cultivate deeper and more meaningful partnerships and connections with other institutions of learning in their communities for more strategic impact?
  • How do we as librarians facilitate the creation of sustained networks to help people make connections between social, academic, and interest-driven learning?  (See pp. 46-47 in the report for more on this question.)

As you can see, these learning and design principles as well as the findings and concerns shared in the report have saturated my thinking.  As I do additional reading and make more passes through my notes from the report, I will continue to take an inquiry stance to further unpack the concepts and language embedded in this work.  I’ll also revisit the case studies included in the report to further develop ideas about what this work could look like in practice in different library settings.  In addition, I will carve out more time to listen to, as well as contribute to, conversations about connected learning in the NWP study group and the Connected Learning Google Plus group.

Teachers as Learners Conference Keynote and Concurrent Session Presentations

I want to thank the wonderful educators of the Griffin-Spalding County School System in Griffin, Georgia for inviting me to present a keynote speech and four concurrent sessions this past week at their Teachers as Learners Conference.  Below are resources from two days of learning and sharing!