Carnegie’s Productive Persistence (PP) subnetwork is employing the tools of improvement science, using real student data, creating new ways to analyze those data, and conducting research in members’ own classrooms, to address the problem of student motivation, tenacity, and skills for success in the Community College Pathways (CCP).
The Productive Persistence subnetwork is a cross-college collaborative of faculty members, Carnegie researchers, and staff organized to improve specific drivers that determine whether a student stays in the classroom and succeeds.
Drivers and Change Ideas
Leading up to the fall term, the Productive Persistence subnetwork team used results from previous Productive Persistence surveys and the research literature to identify three drivers that affect students’ social ties in the classroom: a sense of belonging, a sense that professors care about them, and their feelings of comfort in asking questions.
These drivers were selected because data from Pathways students indicated that they were closely related to pass rates (C or better) and persistence rates (students enrolling in the next term of Statway). Focusing on these drivers concentrated our work on areas we think can significantly benefit students. In the process, we developed new ways to conduct research, gather information about our students, and look more deeply at Productive Persistence. Based on this work, subnetwork members designed “change ideas” around one of the three drivers and then tested them with Plan-Do-Study-Act (PDSA) cycles, a central improvement science tool.
Data are critical to testing ideas, and subnetwork members used data in ways they never had before. For example, members analyzed attendance and the number of students asking questions on run charts, another critical tool from improvement science, and designed custom student surveys.
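A run chart simply plots a measure over time against its median, so a shift of several consecutive points above or below the center line signals a real change rather than noise. As a minimal sketch of the idea, assuming hypothetical weekly counts of students asking questions (not actual Pathways data):

```python
from statistics import median

# Hypothetical weekly counts of students who asked a question in class.
weekly_questions = [3, 5, 4, 6, 8, 7, 9, 10]

# Run charts use the median as the center line.
center = median(weekly_questions)

# Mark each week as above (+), below (-), or on (=) the center line.
marks = ["+" if n > center else "-" if n < center else "=" for n in weekly_questions]

for week, (count, mark) in enumerate(zip(weekly_questions, marks), start=1):
    print(f"week {week}: {count:2d} {mark}")
print(f"median (center line): {center}")
```

A long run of `+` marks late in the term, like the one in this made-up series, is the kind of pattern faculty would read as early evidence that a change idea is working.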
Aaron Altose from Cuyahoga Community College began developing a routine in which students received a “question card” to hold onto each week. As students asked questions, Aaron collected the cards, which let him track the number of students asking questions. He documented his observations and learning on standardized PDSA cycle forms. “Writing up the cycles was helpful for me,” he said. “I reflected more deeply on what I was trying to do and it helped with planning for how I would modify it, what I would expect out of it and what I would hope to see.” He feels his change idea shows promise and will continue his PDSA tests during the winter/spring term.
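A PDSA write-up pairs each test with a prediction, what actually happened, and a decision for the next cycle. The sketch below models that structure; the field names and the example entries are illustrative assumptions, not the layout of Carnegie’s actual forms or Aaron’s real results:

```python
from dataclasses import dataclass, field

# Hypothetical minimal record mirroring a standardized PDSA cycle form.
@dataclass
class PDSACycle:
    change_idea: str
    plan: str                  # prediction and how it will be measured
    do: str                    # what actually happened during the test
    study: str                 # how results compared with the prediction
    act: str                   # adapt, adopt, or abandon for the next cycle
    measures: list = field(default_factory=list)  # e.g. weekly question counts

# Illustrative entry (made-up data) for a question-card test.
cycle = PDSACycle(
    change_idea="Question cards to encourage student questions",
    plan="Hand each student a card; predict more students will ask questions",
    do="Collected cards as questions were asked over four class sessions",
    study="More students asked questions in weeks 3-4 than in weeks 1-2",
    act="Adapt: refine the card routine in the next cycle",
    measures=[3, 5, 7, 8],
)
```

Keeping the prediction and the decision in the same record is what makes successive cycles comparable, which is the point of writing them up.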
Nicole Gray from Foothill College also tried new routines for data collection on student questions. She tested a process where students helped her collect data on questions asked in class. Consistent with improvement science principles, she started small by creating a simple form and getting one student’s feedback on the form. After a couple of cycles of getting more feedback, she had a student use the form during class. Even though the goal of the tests was to streamline the data collection process, there were additional benefits. “I also learned that the students are quite interested in participating in these things,” she said. “It helped them actually focus in the class more. They generally were excited about the idea that I cared about how students participated.” From her work, we now have a tool that can streamline data collection moving forward. Her next steps are to test routines that she thinks will further encourage student engagement.
We’ll post more soon from faculty interviews in Carnegie’s Networked Improvement Communities (NICs) about their subnetwork experiences. We have more to share about instructional routines, and we will delve into how faculty are dealing with the challenges of using improvement science to improve teaching and learning. Until then, we would love to hear from other faculty members willing to share their efforts.