Posts by Kirk Vandersall

February 11, 2021

Arroyo Research Services is pleased to invite applications for three new positions in our growing national education research, measurement, and evaluation practice: a Senior Associate who will lead projects and engage with senior education leaders, an Associate who will conduct evaluations and provide a variety of research and evaluation services across multiple projects, and a Research Associate who will contribute to data collection, analysis, and writing.

About Arroyo
Arroyo Research Services is an education professional services firm that uses the tools of social science to help school districts, state departments of education, universities, and education organizations achieve the highest standards of educational excellence and equity. Our services include research, measurement and evaluation, strategic planning, technical assistance, and related data systems. Our work includes programs designed to advance educational outcomes for underrepresented youth from birth through college, serve migrant youth and families, build optimal health among teens, increase STEM learning and educational attainment, and promote STEM-related innovation. Founded in Los Angeles in 2005, the firm is now based in Asheville, North Carolina.

Who We Serve
We care most about projects that seek positive social change through educational programs. These include programs designed to improve educational outcomes for the children of migrant farm workers, provide a pathway to and success in university STEM programs and careers, promote women in technology, achieve optimal health among teens in distressed communities, and ensure access to culturally responsive teaching of mathematics and science. Recent clients include the state education agencies of Arizona, Arkansas, Colorado, Florida, Georgia, Hawaii, Illinois, Iowa, Kentucky, Maine, Maryland, Mississippi, Montana, New York, Ohio, Pennsylvania, Texas, and Virginia; school districts including Fairfax County Public Schools (VA), Hillsborough County Public Schools (FL), Columbus and Dayton Public Schools (OH), Baltimore County Public Schools (MD), and Socorro Independent School District (TX); organizations that include The Gallup Organization, Learning Forward, Hudson Alpha Institute for Biotechnology, Strada Education Network, and Reach Virginia; and universities including California State University, Fullerton, University of South Florida, University of Virginia, State University of New York, and others. Our work is funded by the National Science Foundation, U.S. Department of Education, National Institutes of Health, Department of Defense Education Activity, the Sandler Family Foundation, Apple Federal Credit Union Education Foundation, and others.

How We Work
Work at Arroyo is both collaborative and independent. You work with a remote team with diverse experience across the United States. Team members are equipped as professionals with the computers, software, phones, and connectivity to succeed. We use a variety of industry-standard communication, project management, and analysis tools, including Google Workspace, Zoom, Microsoft Office, Asana, Harvest, R, and others. You will receive regular feedback on your work designed to help you grow as a professional. Team members typically work on multiple projects at the same time, which keeps the work interesting and provides areas of both comfort and growth.

Who Should Apply
Arroyo Research Services is committed to inclusion and diversity and is an equal opportunity employer. We welcome applications without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics. DACA recipients are encouraged to apply.

To Apply

  1. Review the specific job description: Senior Associate, Associate, Research Associate
  2. Submit an email that includes a cover letter and resume. Include the position title in the email subject line. Make sure document file names include your first and last name.
  3. We will respond to each submission with next steps.

When asked publicly or privately about high-stakes assessments for teachers and schools, we always say the same thing: don’t go there. Using value-added models based on student test scores to reward or punish teachers misdiagnoses educator motivation, steers educators away from good assessment practices, and unnecessarily exposes them to technical and human testing uncertainties. Now, to be clear, we do use and value standardized tests in our work. But here’s a 10,000-foot view of why we advise against the high-stakes use of value-added models in educator assessments:

  1. Using value-added scores to determine teacher pay misdiagnoses teacher motivation.

When Wayne Craig, then Regional Director of the Department of Education and Early Childhood Development for Northern Melbourne, Australia, sought to drive school improvement in his historically underperforming district, he focused on building teachers’ intrinsic motivation rather than the use of external carrots and sticks. His framework for Curiosity and Powerful Learning presented a matrix of theories of action that connect teacher actions to learning outcomes. Data informs the research that frames core practices, which then drive teacher inquiry and adoption. The entire enterprise is built on unlocking teacher motivation and teachers’ desire to meet the needs of their students.


Our past two posts covered both the “why” of measuring implementation and some of the common challenges to doing so. In this third and final post, we’ll look at what is most useful to measure.

Implementation measures are particular to each program and should take into account the specific actions expected of program participants: who is doing what, when, where, how often, etc. Participants may be teachers, students, administrators, parents, advocates, tutors, recruiters, or institutions (e.g., regional centers, schools, community organizations). Specific measures should help stakeholders understand whether, how, and with what intensity a program is being put into place. Moreover, for programs with multiple sites or regions, understanding differences among them is critical.
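To make the "who is doing what, when, where, how often" framing concrete, here is a minimal sketch of one way such implementation records could be structured and aggregated by site. The field names and sample values are illustrative assumptions, not part of any actual Arroyo instrument.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ImplementationRecord:
    participant_role: str   # who: teacher, student, tutor, institution, ...
    activity: str           # what: the specific action expected of them
    site: str               # where: region, school, or center
    occurred_on: date       # when
    duration_hours: float   # how often / with what intensity

# Hypothetical records from two sites of the same program.
records = [
    ImplementationRecord("tutor", "small-group math tutoring", "Site A", date(2021, 2, 1), 0.5),
    ImplementationRecord("tutor", "small-group math tutoring", "Site B", date(2021, 2, 1), 1.5),
]

# Aggregating intensity by site surfaces the cross-site differences
# the post describes as critical for multi-site programs.
by_site: dict[str, float] = {}
for r in records:
    by_site[r.site] = by_site.get(r.site, 0) + r.duration_hours
print(by_site)  # {'Site A': 0.5, 'Site B': 1.5}
```

Even a flat record like this makes differences in dosage visible as soon as sites are compared side by side.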




In our last post, we shared four reasons why educators should be measuring implementation; here we’ll look at four common challenges to strong implementation measurement.


1. Differential definitions. What happens when different units of your program operate with different working definitions of a measure?

Take tutoring, for example, in a multi-site program where each site is asked to report the number of hours per week a participant is tutored. Site A takes attendance and acknowledges that, although the after-school program runs for 1.5 hours, only 0.5 hours are spent tutoring. So Site A reports the number of days a student attends, multiplied by 0.5: if Jose attends for 3 days, Site A reports 1.5 hours of tutoring. Site B calculates 1.5 hours of tutoring per day times 5 days per week, per participant: so if Jose is a participant that week, regardless of how often he attends, Site B reports 7.5 hours of tutoring.
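The two working definitions above can be sketched as a short calculation; the function names are illustrative, but the numbers come straight from the example.

```python
def site_a_hours(days_attended: int, tutoring_hours_per_day: float = 0.5) -> float:
    """Site A: attendance-based; counts only the half hour actually spent tutoring."""
    return days_attended * tutoring_hours_per_day

def site_b_hours(program_hours_per_day: float = 1.5, days_per_week: int = 5) -> float:
    """Site B: enrollment-based; full program hours for all five days, regardless of attendance."""
    return program_hours_per_day * days_per_week

# Jose attends 3 days this week:
print(site_a_hours(3))   # 1.5 hours reported by Site A
print(site_b_hours())    # 7.5 hours reported by Site B
```

The same student, the same week, a fivefold difference in reported dosage: exactly the kind of divergence a shared operational definition is meant to prevent.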


Understanding implementation is critical to both program improvement and program evaluation. But measuring implementation is typically undervalued and often overlooked. This post is one of three in a series that focuses on measuring implementation when evaluating educational programs.


“Fidelity of implementation” ranks next to “scientifically based research” on our list of terms thrown about casually, imprecisely, and often for no other reason than to establish that one is serious about measurement overall. Sometimes there isn’t even a specified program model when the phrase pops up, rendering fidelity impossible. Other times we think all stakeholders are on the same page and so don’t bother to measure implementation at all.

That should change. Here’s why.
