Our clients are by and large genuine change-makers, motivated to measure and achieve the positive outcomes they seek. One of our most important jobs is helping them develop and use appropriate data to enhance discovery, analysis, insight, and direction. But even client commitment and professional diligence don’t always prevent common data collection pitfalls.
Through countless evaluations of school- and district-level educational programs, as well as multi-site, statewide initiatives, we have identified the pitfalls that follow. They may seem like no-brainers. But that’s what makes them so easy to fall into, even for seasoned evaluators and educational leaders. We highlight them here as a reminder to anyone looking to accurately measure their impact:
1. Asking a leading question versus a truly open-ended one. If you aim for honesty, you must allow respondents to give negative responses as well as positive ones. For instance, asking:
“How would putting an iPad into the hands of every student in this district improve teaching and learning outcomes?”
…assumes teaching and learning outcomes will be improved, at least to some degree.
That women are traditionally underrepresented in technology-related careers will surprise no one. It is something we’ve seen in our work, our research, and with our own children. In the 2015 ARS evaluation of STEM, Inc., a coding and entrepreneurship project designed for middle school students, only 32% of participants were girls, and girls were almost half as likely as boys to have any prior coding or robotics experience (35% versus 65%).
Moreover, a recent survey of over 5,700 middle school students found that boys agreed more with the statement that they are good at solving computer problems. Boys are also more likely than girls to say they plan to study computers in college; they are more likely to create technology; and they demonstrate a more positive attitude toward computers and computer classes. Among our own middle and high school aged children, we note significantly more external encouragement toward coding and technology among boys than among girls, manifest in recruitment and participation in after-school coding clubs and in AP Computer Science course enrollment. All of these factors contribute to the significant decline in young women’s pursuit of computer science degrees and the current lack of gender parity in the technology workforce.
When President Obama signed the bipartisan Every Student Succeeds Act (ESSA), which reauthorizes the Elementary and Secondary Education Act (ESEA) of 1965, into law, we had two reactions: 1) Finally! Congress and President Obama have at last replaced No Child Left Behind (NCLB), and 2) Holy cow! ESSA significantly changes key aspects of our work.
Like most educators, though, we embrace both responses. And we’ve been busy working through what ESSA means for our work and our clients. In doing so, we have come across multiple summaries and commentaries worth sharing, which we recap below.
If you prefer the long version, you can find the full text of the new law here. But our primary takeaway is that ESSA provides targeted resources and tailored prescriptions designed to return accountability and decision-making for student success to state and local leaders.