Last year, Carnegie engaged Jennifer Zoltners Sherer from the University of Pittsburgh to work with a team of Carnegie staff to explore the potential of math-intensive programs as a strategy for addressing the failure rates of developmental mathematics students in community colleges. These math-intensive programs include boot camp programs, summer bridge programs, and accelerated programs aimed at shortening the developmental math sequence or supporting students to test out of the sequence altogether.
Sherer and Alicia Grunow prepared this report after exploring programs using a 90-day cycle process borrowed from the Institute for Healthcare Improvement (IHI). The IHI 90-day cycle scans activity in the field as a “quick way to research innovative ideas and assess their potential for advancing quality improvement.” The goal was to “get under the hood” of these intensive math interventions in order to identify program specifics, synthesize evidence regarding their efficacy and costs, and elicit a deeper understanding of the cause-and-effect logic of their design.
Overall, the programs described in the Sherer/Grunow report have employed a variety of innovative strategies to improve student outcomes in developmental math. Although these intensive programs are targeted at students with developmental math needs, they often support more than just math. They help students learn about college support systems, teach study skills, and serve as opportunities for students to build relationships with peers, mentors, and faculty. One of the most intriguing elements common to many of these programs was their use as an onramp: carefully designed transitions between high school and college. The attention paid to the problem of transition across many programs and contexts suggests that further exploration of how these transitions can be effectively executed is warranted.
As a group, the interventions are characterized more by variation than similarity. Even within interventions that went by the same name, the team encountered differences both in how they were structured and in the elements of their design. In other words, the scan produced evidence that these programs are more a set of local solutions than a class of intervention that is currently well enough understood to be leveraged at scale. Assessing the potential of any one of these localized programs to scale to other contexts will require instrumentation of key elements, implementation in multiple contexts, and common measures of effectiveness.
Carnegie continues to be interested in the utility of a 90-day cycle process for investigating promising educational innovations that have not yet been adequately explored in the educational literature, without tying up significant resources or encountering unnecessary delays. In particular, Carnegie is learning how to design a 90-day cycle methodology that captures intervention activity in a way that allows for an initial assessment of an intervention's potential to produce improvement at scale.
August 24, 2010
In the New York Times article “Scholars Test Web Alternative to Peer Review,” Patricia Cohen advocates using the Internet to expose scholarly thinking to the swift collective judgment of a much broader audience.
January 5, 2011
In a recent Education Week article, representatives from private and government organizations concerned with education research lined up behind the 90-day cycle model, developed by the Institute for Healthcare Improvement (IHI) in Cambridge, Mass., as a way to accomplish “deep-dive, quick turnaround” education research. Carnegie leadership spent a week at…