TECHNICAL PAPER

Student Engagement Exit Ticket

Student Engagement Exit Ticket as a Practical Measure

Characteristics of Practical Measures and Descriptions of the Student Engagement Exit Ticket
Is closely tied to a theory of improvement: Although the Student Engagement Exit Ticket was not tied to a formal theory of improvement, it aligned with teachers’ hypotheses about how the changes they made would lead to improvements. The measure is associated with a problem of practice (i.e., difficulty motivating students) shared by a group of teachers, and it assesses ideal outcomes identified by that group (e.g., students’ intrinsic interest in lessons). The teachers who created the measure together used it to test their individual change ideas and learned from one another.
Provides actionable information to drive positive changes in practice: Because the measure was primarily designed by teachers, the data collected with the survey could give teachers a window into the aspects of their students’ engagement that they cared about most and worked hardest to improve. This should increase the likelihood that they act on the data meaningfully and constructively.
Captures variability in performance: For the group of teacher-designers involved, the measure served as a common measure for improvement. By examining variation in the common measure across classrooms, teachers could answer questions about what works, for whom, and under what conditions; a minimal sketch of such a comparison follows these descriptions.
Demonstrates predictive validity: The measure was constructed in the midst of the COVID-19 pandemic to address urgent problems of practice that arose as schools shifted from in-person to remote learning, so its predictive validity was not studied. In the process of developing the measure, however, teachers provided look-fors related to their problem of practice and identified success indicators. Though not formally studied, this design approach enabled teachers to make informal predictions about variation in student engagement.
Is minimally burdensome to users: During the outbreak of the COVID-19 pandemic, students were asked to fill out many surveys, and many experienced survey fatigue. To ease that burden, the professional learning design team led by Annie Wilhelm, Associate Professor at Southern Methodist University, kept the measure short: only two closed-ended items were included. The survey was also built in Google Forms, a tool the teachers and students already had experience using, and was administered during class time. As a result, students should be less likely to perceive completing the measure as “one more thing” to do.
Functions within social processes that support improvement culture: The measure is a product of group ideation. As teachers brainstormed outcomes worth tracking, they engaged in collaborative visioning of what success could look like. Once a shared vision was established, teachers could then 1) backward-map the changes they would make individually to actualize the vision, 2) implement the changes, 3) collect data using the measure to learn about the implementation, and 4) participate in collective sense-making through a notice-and-wonder data protocol. Together, these processes transformed the measure from a conventional exit ticket into a learning tool for improvement.
Is reported on in a timely manner: Teachers who tested their change ideas using the measure received reports at a professional development session following the testing. Such a delay might make it harder for teachers to make connections between student experience and the practices they tried as part of their change ideas, especially if they did not also document those practices.
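To make the idea of examining variation across classrooms concrete, here is a minimal sketch in Python, assuming exit-ticket responses have been exported to a table. The item names, the 1-5 scale, and the sample data are illustrative assumptions, not the actual survey content or results.

    # Minimal sketch (not the project's actual analysis): summarizing hypothetical
    # two-item exit-ticket responses by classroom to examine variation in the
    # common measure. Column names, the 1-5 scale, and sample data are assumptions.
    import pandas as pd

    # Each row is one student's exit-ticket response (e.g., exported from Google Forms).
    responses = pd.DataFrame({
        "classroom": ["A", "A", "B", "B", "C", "C"],
        "interest":  [4, 3, 2, 2, 5, 4],  # hypothetical item 1: interest in the lesson
        "effort":    [3, 4, 2, 3, 5, 5],  # hypothetical item 2: effort during the lesson
    })

    # Compare classrooms on the common measure; differences can seed
    # "what works, for whom, under what conditions" conversations.
    summary = responses.groupby("classroom")[["interest", "effort"]].agg(["mean", "std"])
    print(summary.round(2))

A summary like this, produced soon after a round of testing, could also feed the notice-and-wonder protocol described above.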

Question on Practical Measures Inspired by the Student Engagement Exit Ticket Measure

What can improvers do to make the design process of practical measures more practical?
Improvers often find the process of developing practical measures laborious. Sometimes, even after considerable work has gone into designing a practical measure and making it “rigorous,” those closest to the work, such as teachers, still find it difficult to see how using the measure can help them improve their practice, and the measure may end up sitting on the shelf.

One way to address this challenge is to involve practitioners early on in the design process of practical measures. Annie Wilhelm, who built the design process for the Student Engagement Exit Ticket, said, “I was trying to make the [design process] as close to [teachers’] practice and their needs as possible.” For Annie, it’s important for teachers to have a sense of agency over their own learning. Although the teacher-driven process that she crafted was “possibly wrong and definitely incomplete,” it “got something out there that would give [teachers] something to work with.”

Annie’s example demonstrates that when it comes to bringing in teachers to create practical measures, it is okay to start small. In addition, modeling and scaffolding might be needed to help teachers better understand not only the steps but also the value of the process. As their capacity builds, teachers will be more prepared to iterate on practical measures to optimize their rigor, impact, and practicality.

