The best educational measurement strategy is the one that fits your specific circumstance, not the hottest method of the day. And not necessarily the one that policymakers believe is singularly able to deliver “real, hard data.”
The What Works Clearinghouse (WWC) at the Institute of Education Sciences (IES), for example, reserves its highest confidence rating for studies based on well-implemented Randomized Controlled Trials (RCTs), arguably the gold standard in evaluation. RCTs are not only credible; they are presumed to be top of the line in eliminating bias and uniquely capable of surfacing replicable results (assuming fidelity to the original model).
Much has been written in the past year about whether RCTs deserve the vaunted status they’ve been assigned (see, for instance, the debate between Lisbeth Schorr and the Center for the Study of Social Policy’s “Friends of Evidence” group, on one side, and Patrick Lester of the Social Innovation Research Center, on the other).
Our clients are, by and large, genuine change-makers, motivated to measure and achieve the positive outcomes they seek. One of our most important jobs is helping them develop and use appropriate data to enhance discovery, analysis, insight, and direction. But client commitment and our professional responsibility aren’t always enough to avoid some common data collection pitfalls.
Through countless evaluations of school- and district-level educational programs, as well as multi-site statewide initiatives, we have identified the pitfalls that follow. They may seem like no-brainers, but that’s what makes them so easy to fall into, even for seasoned evaluators and educational leaders. We highlight them here as a reminder to anyone looking to accurately measure their impact:
1. Asking a leading question rather than a truly open-ended one. If you aim for honesty, you must allow respondents to give negative responses as well as positive ones. For instance, asking:
“How would putting an iPad into the hands of every student in this district improve teaching and learning outcomes?”
…assumes teaching and learning outcomes will be improved, at least to some degree. A more neutral version would ask what effect, if any, putting an iPad into the hands of every student would have on teaching and learning outcomes.
U.S. school systems are crafting new approaches to desegregation in response to mounting evidence of growing racial isolation and strong evidence of the value of integration. Ensuring the success of these initiatives, including creating and sustaining community support, requires clear thinking about measurement and evaluation. And while translating the rich body of integration-related social science research into actionable evaluation can be daunting, avoiding simple “box score” approaches to integration measures can help districts achieve deep, sustainable reform. We therefore propose a framework for evaluating broad-scale desegregation initiatives that considers:
The Department of Defense recently announced the 2016 round of Grants to Military-Connected Local Educational Agencies for Academic and Support Programs (MCASP), designed to support military-connected schools. These grants have been important for schools that serve military families and students, and our evaluations of MCASP programs have shown them to be effective. If your school or district serves military-connected families, you should review the application and consider applying.
We were pleased to release the Arroyo Research Services Evaluation of the Texas Dropout Recovery Pilot Program: Cycles 1 and 2 in May 2011. Conducted from 2008 through 2011, the evaluation assisted the Texas Education Agency in examining the effects of the TDRPP pay-for-performance model, which directly tied project payments to demonstrated student academic progress and program completion. Full results and program descriptions are included in the link below; summary results are presented in the Executive Summary. Key findings from the evaluation include:
- Grantees served 4,141 students, twice as many as projected.
- 1,283 students completed the program by earning a high school diploma or demonstrating college readiness.
- The average TDRPP graduate is expected to earn $246,348 more in his or her lifetime than a high school dropout.
- TDRPP is expected to save the state $95.3 million in current dollars after accounting for initial program expenditures.
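To put the arithmetic behind these findings in one place, here is a minimal back-of-the-envelope sketch in Python. The completer count, per-graduate earnings differential, and net state savings are the figures reported above; the aggregate lifetime earnings gain is simply the product of the first two and is our illustration, not a number from the evaluation itself.

```python
# Illustrative back-of-the-envelope math using the reported TDRPP figures.
# The per-graduate earnings differential and the state's net savings come
# from the evaluation report; the aggregate gross figure below is a simple
# product and is NOT a number reported in the evaluation itself.

completers = 1_283                    # students who earned a diploma or showed college readiness
earnings_gain_per_graduate = 246_348  # expected lifetime earnings vs. a dropout, in dollars
reported_net_state_savings = 95_300_000  # net of initial program expenditures, current dollars

# Implied aggregate lifetime earnings gain across all completers
gross_lifetime_gain = completers * earnings_gain_per_graduate
print(f"Implied aggregate lifetime earnings gain: ${gross_lifetime_gain:,}")
# -> Implied aggregate lifetime earnings gain: $316,064,484

# Note: the $95.3M state savings is a separate, reported figure (fiscal
# savings to the state), not derivable from the earnings differential alone.
print(f"Reported net state savings: ${reported_net_state_savings:,}")
```

The distinction the comments draw matters when communicating results: lifetime earnings gains accrue to graduates, while the $95.3 million is a fiscal estimate for the state, so the two should not be added or compared directly.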