Building data systems to track and improve outcomes

Prep programs often survey student teachers about their experiences in clinical practice, but data collection efforts may not go further than that. As a result, the programs themselves, as well as districts and states, miss an opportunity to learn what is working in clinical practice and to leverage the data to inform staffing decisions and drive improvements. Building out these data systems is ideally done in concert by prep programs, districts, and states, but there are many examples of individual entities developing systems to gather data and, more importantly, to use the data to drive change.

Southeastern Louisiana University 

This institution uses data to inform program improvement at all levels.

  • For residents: Residents engage in a POP cycle (pre-observation meeting, observation, post-observation meeting) with their site coordinator (i.e., program supervisor). For example, prior to the pre-observation meeting, residents collect student data in response to several specific reflection questions, and the program supervisor probes deeper into the data with questions like, “10% of the class fell far below expectations in this area; what can you tell me about why they struggled?” They revisit the assessment data in the post-observation meeting to identify how students have grown and what did and did not work.
  • For program supervisors: Known within the program as site coordinators, program supervisors review data across all of their residents, looking for trends in the feedback and ratings they give. The prep program compares these insights across supervisors to identify areas where supervisors need additional professional learning (a simplified sketch of this kind of trend analysis appears after this list). For example, residents consistently scored low on the “questioning” indicator of their observation rubric, so the university partnered with the National Institute for Excellence in Teaching (NIET) to facilitate a workshop on questioning skills. All faculty attended the workshop, based on the premise that “if we don’t have a common language among instructors of freshmen courses, we can’t expect instructors of senior courses to do something different.” Program leaders are also working to develop program supervisors’ ability to analyze their own data.
  • For districts: District staff members attend the quarterly governance meeting (held among the prep program and the 10 districts with which it partners for full-time clinical practice) to receive data about all of their residents, including residents’ evaluation ratings, self-assessments, feedback from cooperating teachers, and insights regarding PK–12 student achievement.
  • For the prep program: In addition to reviewing all the data described above, the program surveys student teachers, cooperating teachers, and district stakeholders at the end of each semester of the residency to gather feedback about the program, the site coordinator, and the mentor teacher.
  • Read more in the Southeastern Louisiana case study
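
The trend analysis described above can be pictured as a simple aggregation over observation records. Below is a minimal sketch, assuming a flat list of rubric ratings and an arbitrary flag threshold; the field names, scale, and cutoff are illustrative, not Southeastern Louisiana’s actual data model.

```python
# Minimal sketch: average each rubric indicator across residents and flag
# indicators that fall below an assumed threshold, mirroring how the
# "questioning" weakness surfaced. All names, ratings, and the cutoff are
# hypothetical; the program's real schema and scale are not described here.
from collections import defaultdict
from statistics import mean

observations = [
    # (site_coordinator, resident, rubric_indicator, rating on an assumed 1-5 scale)
    ("coord_a", "res_1", "questioning", 2.0),
    ("coord_a", "res_2", "questioning", 2.5),
    ("coord_a", "res_1", "lesson_structure", 4.0),
    ("coord_b", "res_3", "questioning", 2.0),
    ("coord_b", "res_3", "lesson_structure", 3.5),
]

FLAG_BELOW = 3.0  # assumed cutoff for "needs shared professional learning"

# Pool ratings by indicator across all supervisors to spot program-wide trends.
by_indicator = defaultdict(list)
for _, _, indicator, rating in observations:
    by_indicator[indicator].append(rating)

for indicator, ratings in sorted(by_indicator.items()):
    avg = mean(ratings)
    note = "  <- candidate for a faculty-wide workshop" if avg < FLAG_BELOW else ""
    print(f"{indicator}: {avg:.2f}{note}")
```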

Spokane Public Schools 

Spokane developed a student teaching portal that connects to existing data systems, including HR data on teachers’ evaluation ratings and years of experience, the district database that tracks people’s involvement with the district (e.g., working as a volunteer or substitute), and the system that tracks background checks and clearance processes. The portal streamlines many steps of the student teaching placement process and could be augmented to track additional information (e.g., more insight into why potential cooperating teachers decline to host a student teacher, observation ratings associated with student teachers, and whether student teachers go on to take jobs in the district and where).
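
The source does not describe the portal’s internals, but the cross-system join it relies on can be sketched in a few lines. This is a minimal sketch with invented record shapes and eligibility thresholds; Spokane’s actual schemas are not described in the source.

```python
# Hypothetical sketch of the kind of join a student teaching portal might
# perform across existing district systems (HR, background checks). Class
# names, fields, and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class HrRecord:                 # from the HR system
    teacher_id: str
    evaluation_rating: float    # most recent summative rating, assumed 1-4
    years_experience: int

@dataclass
class ClearanceRecord:          # from the background-check system
    person_id: str
    cleared: bool

def eligible_hosts(hr_records, clearances, min_rating=3.0, min_years=3):
    """Teachers who clear the (assumed) rating, experience, and
    background-check bars for hosting a student teacher."""
    cleared_ids = {c.person_id for c in clearances if c.cleared}
    return [
        r.teacher_id
        for r in hr_records
        if r.teacher_id in cleared_ids
        and r.evaluation_rating >= min_rating
        and r.years_experience >= min_years
    ]

print(eligible_hosts(
    [HrRecord("t1", 3.8, 12), HrRecord("t2", 2.5, 20)],
    [ClearanceRecord("t1", True), ClearanceRecord("t2", True)],
))  # -> ['t1']
```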

Arkansas

Forward Arkansas is working with prep programs in its EPP Design Collaborative to build stronger data systems. As Dr. Malachi Nichols, director of data and strategy, put it, “If Target knows when my toddler needs diapers, we should know when we have a cohort of 8th grade teachers who have been in the district for 40 years and next year they’re going to retire. We should have the data to inform the pipeline, we should be able to forecast gaps in the pipeline moving forward.”
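
The forecasting Nichols describes amounts to flagging staffing cohorts by seniority. Below is a minimal sketch, assuming years of service as a retirement proxy and invented district data; a real model would draw on age, pension rules, and historical attrition rates.

```python
# Hypothetical sketch of the pipeline forecast described above: count
# teachers in each district/assignment cohort who are nearing retirement.
# The data shape and the years-of-service proxy are assumptions.
from collections import Counter

teachers = [
    # (district, assignment, years_of_service)
    ("district_1", "grade_8_science", 38),
    ("district_1", "grade_8_science", 36),
    ("district_1", "grade_3", 12),
    ("district_2", "grade_8_science", 5),
]

NEAR_RETIREMENT = 30  # assumed proxy threshold

# Tally cohorts with looming vacancies so the pipeline can be planned ahead.
gaps = Counter(
    (district, assignment)
    for district, assignment, years in teachers
    if years >= NEAR_RETIREMENT
)
for (district, assignment), n in sorted(gaps.items()):
    print(f"{district}: {n} {assignment} teacher(s) likely to retire soon")
```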

In Arkansas, much of the data needed to know where student teachers are, where they’re hired, and what they’re teaching already exists at the state level, but prep programs and districts may not be aware enough of it to make strategic decisions. In its work with Southern Arkansas University and the University of Arkansas at Little Rock, Forward Arkansas is bringing greater attention to data related to three standards in the state’s review: who the program is recruiting, what’s happening in their preparation (including coursework and licensure test passage rates), and what the outputs are (where candidates are getting hired and whether they are persisting). This work extends beyond clinical practice to examine whether candidates who complete a program are still teaching several years later. Over the next three years, every institution will begin collecting this data to meet the new program approval criteria.

Michigan 

Michigan has several data system elements in place already, with the potential to further connect and refine them. The state developed its own student teaching portal to administer student teaching stipends. While this system is not as robust as what Spokane Public Schools developed, it holds potential for future data collection efforts. The state also has a process in place to gather detailed data about clinical practice experiences (to inform performance scores on its educator preparation institution report) and to share program alumni’s evaluation ratings back with prep programs, which can support their selection of cooperating teachers.

Texas 

Texas stands out for its work to build strong, transparent data systems. The state publishes a set of educator prep program data dashboards on a range of topics, including candidates’ completion rates, employment, and retention outcomes. The data system also provides indicators related to clinical practice, including the frequency and quality of observations. In the next phase of work, the state plans to provide more specific information about program experiences related to clinical practice, to better understand which program activities (e.g., required hours of clinical practice, number of observations) are associated with better outcomes.

Tennessee

In Tennessee, researchers used existing state data to build an algorithm to inform the selection of cooperating teachers. The project found that providing districts with recommended placements and cooperating teachers resulted in more effective and more experienced teachers being recruited to serve as cooperating teachers, and candidates in these districts felt better prepared. Nate Schwartz, who was part of this project team and currently works with EdResearch for Action, identifies enabling conditions for this work (a simplified sketch of the ranking idea follows the list):

  • State engagement in partnerships: The state was heavily engaged in building partnerships between prep programs and districts, going beyond issuing guidelines to actively supporting those partnerships. These relationships built the trust and interaction that facilitated the piloting of this algorithm.
  • Researcher with deep familiarity with state data: The head researcher on this work, Matt Ronfeldt, worked in the state for years and became deeply familiar with its data. Building on his earlier work creating prep program report cards, he was able to identify new opportunities to use existing data to drive improvements in the teacher preparation system.
  • Substantial teacher effectiveness data: Tennessee had statewide data on teacher quality, including both observation ratings and value-added ratings based on student test scores, creating an opportunity to use the data to identify potential cooperating teachers across the state.
  • Debate and curiosity about what works best: At the outset of this work, the education community was debating whether it was more important to train available cooperating teachers or to select the best cooperating teachers. The results suggested that selecting the right cooperating teachers was the more effective approach. Curiosity helped fuel the work, leading to a revolution in how researchers, prep program leaders, and state leaders approach the recruitment and selection of cooperating teachers.
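
The published materials do not spell out the algorithm’s exact form, but its core idea, ranking potential cooperating teachers on statewide effectiveness data, can be sketched as follows. The weights, scales, and field names are assumptions for illustration, not the Ronfeldt team’s actual model.

```python
# Hypothetical sketch of ranking potential cooperating teachers by blending
# observation ratings and value-added estimates, the two statewide measures
# the Tennessee pilot could draw on. Weights and normalizations are invented.
from dataclasses import dataclass

@dataclass
class Teacher:
    name: str
    observation_rating: float  # assumed 1-5 statewide observation scale
    value_added: float         # standardized value-added estimate (z-like)

def score(t, w_obs=0.5, w_va=0.5):
    """Blend the two measures onto a 0-1 scale; weights are illustrative."""
    obs = (t.observation_rating - 1) / 4
    va = min(max((t.value_added + 2) / 4, 0.0), 1.0)  # rough clip of a z-score
    return w_obs * obs + w_va * va

candidates = [
    Teacher("T1", 4.6, 1.2),
    Teacher("T2", 3.1, -0.4),
    Teacher("T3", 4.9, 0.8),
]

# Districts would receive the top-ranked teachers as suggested hosts.
for t in sorted(candidates, key=score, reverse=True):
    print(t.name, round(score(t), 3))
```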

Read more about the experiments in Tennessee.