INTERNAL RESEARCH CAPACITY WITHIN AN SEA:
A Learning Journey to the Massachusetts Department of Elementary and Secondary Education

By Rebecca Lavinson, former Policy Associate, AYPF

With special thanks to contributors Kendra Winner, Research and Evaluation Coordinator, Office of Planning and Research, Massachusetts Department of Elementary and Secondary Education, and Carrie Conaway, Senior Lecturer at the Harvard Graduate School of Education (in her capacity as former Chief Strategy and Research Officer, Massachusetts Department of Elementary and Secondary Education), for information on the development and workings of the department’s internal research office and research strategy.

Thanks also to partner organizations for their help in creating this learning journey, with special thanks to Sara Kerr, Vice President of Education Policy, Results for America, and Paula Arce-Trigatti, Director, National Network of Education Research-Practice Partnerships.

******

In July, a group of 30 researchers and state-level policymakers representing 10 State Education Agencies (SEAs) and various organizations traveled to the Massachusetts SEA in Malden to participate in a learning journey led by the American Youth Policy Forum (AYPF), Results for America (RFA), and the National Network of Education Research-Practice Partnerships (NNERPP); the majority of the participating SEAs were part of the RFA State Education Fellowship program. During this day-long event, participants visited the Massachusetts Department of Elementary and Secondary Education (DESE) and learned about building internal and external research capacity. DESE program offices and partners discussed the successful collaboration among the program offices, the research office, and external researchers. Participants also learned about DESE’s evolving efforts to support the use of data and evidence to drive decision making and improve student outcomes. Participants walked away from the learning journey with concrete ideas and steps for improving their own internal research offices, building on DESE’s example.

The group visited DESE due to the unique relationship between its internal research office and its program offices, which has developed and strengthened over time. DESE’s research office plays a critical role in coordinating between the program offices and external researchers throughout all stages of the research process, acting as an intermediary and improving DESE’s research capacity and effectiveness. DESE’s unique beginnings set new expectations for the role of the research office and program offices, which are now ingrained in DESE’s operations.

The Beginning

While DESE’s current internal research office was established in 2007, the idea of creating a research office was not new. DESE had a research office in the early 1990s, but changes in education reform shifted personnel from the research office to the assessment office. Over time, however, Massachusetts’ longitudinal data systems – including assessment data and student demographic and program data – matured, and DESE’s deputy commissioner saw the value these data systems could provide in improving state policy with more dedicated resources. In 2006, a director of research position was posted, and in early 2007 the inaugural director was hired. In this post, the director was tasked with building a team dedicated to learning about and from the agency’s work and feeding that learning back into the policymaking and planning process.

A few years later, in 2010, DESE received Race to the Top (RTT) funding, which brought with it approximately $9M to fund research and evaluation on DESE’s RTT activities. Unlike previous research funding streams, RTT research funding was located within the research office rather than the program offices. This shift had important implications for the program offices: 1) the allocation of the research funding was made by the research office to reflect the highest priorities in the RTT proposal, and 2) the research office worked closely with program staff across the agency to design the 22-plus evaluations that came out of the RTT funding. This new process for determining the research needs of program offices led to greater collaboration between the research office and program offices across DESE. Program offices that had never engaged in research or worked with the research office began partnering directly with the research office to define the research they wanted to conduct. Combining the expertise of research team members and program staff became the standard way in which DESE engages with external researchers, providing both substantive expertise from the program offices and research expertise from the research office.

The influx of RTT research funds led DESE to expand and refine its system of working with external researchers to require greater collaboration between the program offices and the research office. Research office staff increased their involvement in developing research questions with program offices, contracting external researchers, formulating deliverables, and approving any changes along the way. The volume of RTT research activities created an opportunity for many program offices across the department to work closely with the research office and established a new standard for research collaboration across the agency.

Current Practices

Massachusetts’ RTT research strategy laid the foundation for DESE’s current approach to collaboration between its internal research office and its program offices, marked by greater involvement and cooperation on both sides. This enhanced process stayed intact even after the end of RTT research funding, when research funding again came primarily from the program offices. Today, DESE’s internal research office remains heavily involved throughout the entire research process: it contributes to identifying research questions and contracting external researchers, provides advice and project management during the research process, and engages in continuous improvement.

1. Identifying research questions and contracting: DESE’s research office works jointly with the program offices to identify research questions. The research office is involved in the entire procurement process, from drafting the bid with the appropriate research deliverables to selecting the external researchers and approving the contract. As part of the contracting process, the research and program offices jointly conduct interviews with research finalists; a representative from the research office attends all of these interviews in an advisory role to discuss the deliverables, clarify the research services to be provided, and make any appropriate changes. The research office then requires the selected bidder to write a memo reflecting any new services or major changes to the research plan that came up in the interview before the final contract is executed. This memo becomes part of the formal contract, along with the original bid from the vendor, and is signed by both parties to ensure there is a clear agreement on the research to be completed. The contracts include multiple deliverables to be presented throughout the research study and specify the frequency of check-ins, in which the research office takes part. Research office staff have the expertise to assess the financial cost of the research and how best to address the research question within the allocated budget; they help the program offices understand what to expect from bidding vendors and the evaluation process, how far the available budget can stretch, and the trade-offs among the research questions the program office hopes to have answered given that budget.

2. Providing project management during the research process: Once the contract is executed, the research office continues its involvement by participating in the check-ins between the program office and external researchers and by reviewing, editing, and approving all final deliverables, including but not limited to the final research plan and budget, data collection protocols, quantitative analysis strategies, interim reports, and final reports. During the evaluation, program office staff looking for advice about particular methodological decisions, such as whether it is better to substitute additional interviews for focus groups, rely on the research team to identify and address any research challenges that arise. Check-ins are typically guided by a document that outlines the decisions and activities required of each player (the DESE research office, the DESE program office, and the external research team), the timeline and allocation of those decisions and activities, and the completion, resolution, and level of priority of each item. DESE provides all of its research partners with a template for this document, though each external research team adapts it to its own style.

3. Engaging in continuous improvement: The research office collaborates closely with the program offices throughout the evaluation, ensuring that the research continues to address the program office’s needs. The research office reviews all deliverables agreed upon in the contract and ensures that any changes the program offices require are made over the course of the evaluation. The goal of every evaluation is to create actionable information that program offices can use to guide their activities. For example, data gathered and reported in the first wave of data collection may have implications for the questions asked in subsequent waves, as when focus group findings surface new questions to include in surveys and interviews. Using this strategy, the program offices and researchers are able to continuously improve the evaluation throughout its duration.

The relationships, processes, and procedures established during RTT became standard practice at DESE. Program offices interested in doing research continue to engage with the research office. Best practices developed during this period that DESE still follows include 1) constant collaboration between program offices and external researchers, with the research office acting as an intermediary; 2) the use of mixed-methods evaluations that include both formative and summative research; 3) the inclusion of multiple deliverables in the contract for multiple audiences (e.g., researchers and Massachusetts educators); and 4) continuous improvement, ensuring key information gathered as part of the evaluation is used to improve the work of the program office throughout the course of the evaluation. According to Kendra Winner, Research and Evaluation Coordinator within the Office of Planning and Research at DESE, the relationship between the research office and program offices is a “partnership in every sense of the word.”

Recommendations from DESE

DESE focuses on specific aspects of its research to ensure a smooth evaluation process. In the publication Building Agency Capacity for Evidence-Based Policymaking (The SEA of the Future, Vol. 5), DESE emphasizes: 1) selecting the right programs for evaluation, 2) requiring feedback and deliverables throughout the evaluation to support continuous improvement, 3) producing school- and district-level reports, and 4) developing concrete public tools for educators to utilize.[1]

1. Select the right program(s) for study

SEAs have limited resources that must be used wisely to produce the best outcomes. When selecting which programs to evaluate, program offices should ensure the selected programs are central to the agency’s strategic priorities and that the research will directly inform the work of the agency. The programs selected should also be long-term and modifiable, so that they can be adjusted based on intermediate research findings during the evaluation. These principles ensure that research is conducted on programs that can be improved and made more effective, and that resources are spent where they can be of greatest value in improving DESE’s work.

2. Request multiple deliverables for continuous feedback

Having deliverables that reflect the agency’s timing and goals for the research is important to the process of continuous improvement. Instead of a single final report, DESE routinely asks for summary reports after each round of data collection so that program offices can access research findings more quickly and start the improvement process sooner. DESE requests documents such as internal memos, briefings for program leadership, and short summaries of findings intended for district audiences. Some type of field-facing report is almost always a deliverable in DESE’s work with research partners so that DESE can more easily communicate findings to agency staff, superintendents, principals, and other educators and stakeholders. This process allows DESE to stay current on the findings and enact changes to the program during the evaluation period.

3. Produce school- and district-level results

DESE requires that researchers produce school- and district-level results, not just statewide aggregations, naming the schools and districts whenever confidentiality is not a concern. DESE also disaggregates results for key subgroups, such as student demographic groups or types of educators. These analyses provide the opportunity to examine variation in program implementation and to gain insight into the conditions under which programs work best. This gives DESE better insight into the context of the findings, and the ongoing research feedback informs programming, technical assistance plans, and any other work that draws on the findings.

4. Develop concrete tools

DESE looks for opportunities to turn research into concrete tools for educators, schools, and districts. As an example, the research team analyzing educator evaluation implementation developed a methodology for assessing whether educators’ evaluations met various criteria of the statewide evaluation system. DESE is using these data to create a toolkit that supports districts in conducting their own self-assessments of educator evaluation implementation. Similarly, DESE has made public a host of evaluation tools created as part of the statewide educator evaluation system research; the goal of providing these tools is to support districts in conducting their own research on their educators’ perceptions of the evaluation system. In a final example, the department used its own resources to create a variety of student survey tools that districts and schools can use to gather information from students about their instructional experiences. Developing and providing concrete tools based on its own research enables DESE to more efficiently leverage the dollars invested in research to support the work of educators across the state.

DESE follows these recommendations to support its capacity for generating and using evidence to improve the department’s effectiveness. DESE’s internal research capacity, the allocation of responsibilities between the research office and program offices, and the research office’s collaborative culture have supported DESE’s ability to perform valuable research to inform its education policies.

Based on the information DESE provided during the event, the learning journey participants discussed the actions they would take to improve their own research capacity, including: 1) increasing the intentionality of their work by aligning their research priorities with their state’s primary educational priorities, 2) improving communication and collaboration between the research office and program offices to break down silos, and 3) building their internal research capacity by creating a research office and/or hiring more researchers, with the ultimate goal of improving student outcomes.

Thanks to the William T. Grant Foundation for its generous support of this work. For more guidance and resources on forming or growing partnerships, visit http://rpp.wtgrantfoundation.org

[1] For more detailed information regarding these recommendations, please visit the publication: Gross, B., and Jochim, A. (eds.). (2015). Building Agency Capacity for Evidence-Based Policymaking. The SEA of the Future, 5. San Antonio, TX: Building State Capacity & Productivity Center at Edvance Research, Inc.

____________________________

The American Youth Policy Forum (AYPF), a nonprofit, nonpartisan professional development organization based in Washington, DC, provides learning opportunities for policy leaders, practitioners, and researchers working on youth and education issues at the national, state, and local levels. AYPF events and publications are made possible by contributions from philanthropic foundations.