February 11, 2021

Arroyo Research Services is pleased to invite applications for three new positions in our growing national education research, measurement, and evaluation practice: a Senior Associate who will lead projects and engage with senior education leaders; an Associate who will conduct evaluations and provide a variety of research and evaluation services across multiple projects; and a Research Associate who will contribute to data collection, analysis, and writing.

About Arroyo
Arroyo Research Services is an education professional services firm that uses the tools of social science to help school districts, state departments of education, universities, and education organizations achieve the highest standards of educational excellence and equity. Our services include research, measurement and evaluation, strategic planning, technical assistance, and related data systems. Our work includes programs designed to advance educational outcomes for underrepresented youth from birth through college, serve migrant youth and families, build optimal health among teens, increase STEM learning and educational attainment, and promote STEM-related innovation. Founded in Los Angeles in 2005, the firm is now based in Asheville, North Carolina.

Who We Serve
We care most about projects that seek positive social change through educational programs. These include programs that improve educational outcomes for the children of migrant farmworkers, provide a pathway to and success in university STEM programs and careers, promote women in technology, achieve optimal health among teens in distressed communities, and ensure access to culturally responsive teaching of mathematics and science. Recent clients include the state education agencies of Arizona, Arkansas, Colorado, Florida, Georgia, Hawaii, Illinois, Iowa, Kentucky, Maine, Maryland, Mississippi, Montana, New York, Ohio, Pennsylvania, Texas, and Virginia; school districts including Fairfax County Public Schools (VA), Hillsborough County Public Schools (FL), Columbus and Dayton Public Schools (OH), Baltimore County Public Schools (MD), and Socorro Independent School District (TX); organizations that include The Gallup Organization, Learning Forward, Hudson Alpha Institute for Biotechnology, Strada Education Network, and Reach Virginia; and universities including California State University, Fullerton; University of South Florida; University of Virginia; State University of New York; and others. Our work is funded by the National Science Foundation, U.S. Department of Education, National Institutes of Health, Department of Defense Education Activity, the Sandler Family Foundation, Apple Federal Credit Union Education Foundation, and others.

How We Work
Work at Arroyo is both collaborative and independent. You work with a remote team with diverse experience across the United States. Team members are equipped as professionals with the computers, software, phones, and connectivity to succeed. We use a variety of industry-standard communication, project management, and analysis tools, including Google Workspace, Zoom, Microsoft Office, Asana, Harvest, R, and others. You will receive regular feedback on your work designed to help you grow as a professional. Team members typically work on multiple projects at the same time, which keeps the work interesting and provides areas of both comfort and growth.

Who Should Apply
Arroyo Research Services is committed to inclusion and diversity and is an equal opportunity employer. We welcome applications without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics. DACA recipients are encouraged to apply.

To Apply

  1. Review the specific job description: Senior Associate, Associate, or Research Associate.
  2. Submit an email that includes a cover letter and resume. Include the position title in the email subject line. Make sure document file names include your first and last name.
  3. We will respond to each submission with next steps.

The best educational measurement strategy is the one that fits your specific circumstance, not the hottest method of the day, and not necessarily the one that politicians believe is singularly able to deliver “real, hard data.”

The What Works Clearinghouse (WWC) at the Institute of Education Sciences (IES), for example, reserves its highest rating of confidence for studies based on well-implemented randomized controlled trials (RCTs), arguably the gold standard in evaluation. RCTs are not only considered credible but are presumed to be top-of-the-line in eliminating bias and uniquely capable of surfacing replicable results (assuming fidelity to the original model).

Much has been written in the past year about whether RCTs deserve the vaunted status they’ve been assigned (see, for instance, the debate between Lisbeth Schorr and the Center for the Study of Social Policy’s “Friends of Evidence” group and Patrick Lester of Stanford’s Social Innovation Research Center).


When asked publicly or privately about high-stakes assessments for teachers and schools, we always say the same thing: don’t go there. Using value-added models based on student test scores to reward or punish teachers misdiagnoses educator motivation, guides educators away from good assessment practices, and unnecessarily exposes them to technical and human testing uncertainties. Now, to be clear, we do use and value standardized tests in our work. But here’s a 10,000-foot view of why we advise against the high-stakes use of value-added models in educator assessments:

  1. Using value-added scores to determine teacher pay misdiagnoses teacher motivation.

When Wayne Craig, then Regional Director of the Department of Education and Early Childhood Development for Northern Melbourne, Australia, sought to drive school improvement in his historically underperforming district, he focused on building teachers’ intrinsic motivation rather than the use of external carrots and sticks. His framework for Curiosity and Powerful Learning presented a matrix of theories of action that connect teacher actions to learning outcomes. Data informs the research that frames core practices, which then drive teacher inquiry and adoption. The entire enterprise is built on unlocking teacher motivation and teachers’ desire to meet the needs of their students.


We are pleased to deepen our work of evaluating educational programs serving highly mobile students through three new Department of Defense Education Activity (DoDEA) MCASP grants: to Hillsborough County Public Schools, Socorro Independent School District, and Fairfax County Public Schools. These new projects extend our prior work on behalf of highly mobile students through the evaluation of multiple migrant education programs and programs that serve military-connected students.

Tinker K-8 – photo by Airman 1st Class Danielle Quilla

Over 80% of military-dependent students attend public schools, many of which are base-adjacent. And military families move often: the average military child moves six to nine times between the start of kindergarten and high school graduation, mostly between states. It’s not difficult to imagine the challenges of navigating differing school schedules, class sizes, immunization and other health requirements, and the transfer of credits from one school to another.


Our clients are by and large genuine change-makers, motivated to measure and achieve the positive outcomes they seek. And one of our most important jobs is helping them develop and use appropriate data to enhance discovery, analysis, insight, and direction. But client commitment and our professional responsibility don’t always prevent some common data collection pitfalls.

Through countless evaluations of school- and district-level educational programs, as well as multi-site, statewide initiatives, we have identified the pitfalls that follow. They may seem like no-brainers. But that’s what makes them so easy to fall into, even for seasoned evaluators and educational leaders. We highlight them here as a reminder to anyone looking to accurately measure their impact:

1. Asking a leading question versus a truly open-ended one. If you aim for honesty, you must allow respondents to give negative responses as well as positive ones. For instance, asking:

“How would putting an iPad into the hands of every student in this district improve teaching and learning outcomes?”

…assumes teaching and learning outcomes will be improved, at least to some degree.
