DESIGN-BASED RESEARCH IS WHAT COLLABORATION SHOULD LOOK LIKE
The U.S. Department of Education’s Office of Educational Technology released a 100-page draft policy report, “Expanding Evidence Approaches for Learning in a Digital World.” While a key focus of the report is the kinds of information we should marshal to evaluate learning technologies, its more important lesson is about people. Through case studies and reviews of current research, the report offers a range of recommendations. The fifth, which cites the work of the Carnegie Foundation, states that the people who use digital learning resources should work with education researchers “to implement resources using continuous improvement processes” (p. 89). This kind of data-driven continuous improvement is baked into the approaches many developers and technologists bring to the field.
A prime example of this always-testing, always-improving process can be found in the Carnegie Foundation-initiated Statway project. In an attempt to double the number of students who earn college math credit within one year of continuous enrollment (p. 21), the schools involved in the project agreed to collaborate with one another, with researchers and developers, and with the teachers who implemented the new programs. Moreover, they agreed to share the data they gathered and then discuss how to refine the implementation. After a small first iteration of the project produced lackluster results, a team redesigned the course, and a new version rolled out across the entire network the next school year. In the first year of Statway at the participating colleges, three times as many students earned a college math credit in one-third the time compared with historical averages.
The report’s authors interviewed Louis M. Gomez of UCLA, a Statway collaborator, about whether the networked schools had conducted an “efficacy study” comparing the new project with traditional methods. Efficacy studies fall within the domain of academic education research and test whether an intervention can achieve a desired effect under ideal conditions. They are, in essence, the polar opposite of design-based research, which deals with real-world situations in all their messy, imperfect glory. Gomez’s reply is telling:
"All kinds of promising interventions are subjected to RCTs [randomized controlled trials, the “gold standard” of academic education research] that show nothing; often because they’re subjected to [experimental studies] too early. Equally important to work on is getting your intervention to work reliably across many different contexts. This is more important at this point than understanding whether Statway works better or worse than some other approach."
The article is in EdSurge.
HOW TO MAKE STRESS WORK IN YOUR FAVOR
New research suggests that all the attention to the risks of stress may actually be part of the problem. Though it tends to get lost in the frenzy, our stress response evolved to do us good; psychologists have long recognized that, under the right conditions, it can improve mental and physical health and boost athletic and cognitive performance. And researchers are finding that one way to unleash this positive side of stress is simply to retrain ourselves to think of it differently. “There are public-health messages everywhere telling us how bad stress is for us,” says Alia Crum, a psychologist at Columbia University’s Business School. “I ask people how that makes them feel, and the answer is, ‘stressed.’”
The goal is to find ways to flip the stress-response cycle from bad to good. Psychologist Jeremy Jamieson, for instance, is working with the Carnegie Foundation for the Advancement of Teaching to help community college students in a remedial math course reappraise their response to math exams. These students have a history of failure, which Jamieson says leads to “math anxiety,” clogging up the working memory that is critical for math. The article is in the Boston Globe.