
Measuring the Impact of Professional Learning


Investing in professional learning for educators comes with the expectation that you’ll be able to evaluate the gains from that investment. But all too often, rolling out a learning plan without an evaluation method means that the second part never happens.

That’s why the Frontline Research & Learning Institute, along with Learning Forward, worked with six districts to examine the best way to measure the impact of professional learning.

Those districts were:

  • Boston Public Schools, Massachusetts
  • Greece Central School District, New York
  • Jenks Public Schools, Oklahoma
  • Metro Nashville Public Schools, Tennessee
  • Prior Lake Savage Area Schools, Minnesota
  • Shaker Heights City Schools, Ohio

Each district brought an existing professional learning program to this small-scale study with the intention of collaboratively determining which changes to the programs would most likely benefit educators at their schools. Their five essential findings might surprise you.

1. Plan evaluation as a holistic part of the program

To evaluate the effectiveness of professional learning programs on both educators and their students, the program needs to be designed with clear, targeted, measurable outcomes and indicators of success. In wrestling with this fact, the districts fell into three buckets of program development:

  • Existing programs with evaluable outcomes in place.
  • Existing programs that required reworking for more clarity of outcomes.
  • In-development programs that didn’t yet have targeted outcomes.

Similar to how educators develop units of learning by planning their final assessments first, these district leaders found it was necessary to do the same. While a “one-off” professional learning event may seem like a good idea, the bigger question is: how does it all fit together?

2. Develop targeted outcomes with existing data sources in mind

In developing their target outcomes, Boston Public Schools leaders looked to the multiple data sources they already had at their fingertips in the district. They considered which might be useful as indicators of the impact of their new professional learning program. Repurposing data sources in this way can help you more easily and quickly evaluate the program without creating a new data burden (Killion, 2018).

The downside: data sources already in place are only approximate measures of the targeted outcomes of the professional learning program. So, district leaders worked to analyze and interpret the results, then form conclusions about the impact of the program.

Weigh the importance of exacting data tailor-made for your program against the functional ease of using existing data sources. Consider a combination of the two, if necessary.

3. Consider a systems approach to better evaluate effectiveness from its inception

When professional learning within a district or school lacks coherence — that is, when each event or initiative feels ad hoc or separate — it’s difficult to measure its effects. That’s why Jenks Public Schools took the opportunity to revise their program using a systems approach. They reworked the planning process to ensure that professional learning met their criteria for quality.

Using this planning model, the district and school leaders aligned professional learning with identified needs, provided adequate implementation support, and monitored implementation to increase the likelihood of results.

4. Continually evaluate both new and existing programs

The reasons for evaluating a new program are obvious:

  • Determine if it’s worth the investment to continue into a second year
  • See how to refine the program
  • Incorporate feedback from participants

But what about after the first year or so, when you feel it’s going well?

The districts in the study found it helpful to continue to evaluate existing programs in the following ways:

  • Run annual data collection from multiple sources about the program to inform continuous upgrades, even after it’s been refined for a year or so.
  • Go a step further to measure the impact of the program on student achievement, connecting the dots between program outcomes and changes in student learning.

Metro Nashville Schools collected data using the Collaborative Inquiry Process in partnership with REL Appalachia, a system for collecting, analyzing, and using a variety of data to improve programs.

These modes of evaluation helped the districts stay freshly engaged with programs, even if they had been running for more than a year or so. Continuous data collection meant they could go a step further in guiding teachers who were implementing learnings, too.

5. Use reliable and flexible systems for data collection and evaluation

Useful evaluations take time, resources, and effort. Districts with data systems that allow them to gather, track, analyze, and access data quickly can more easily monitor a program’s success or spot needed adjustments.

Data systems that generate analytics using multiple types of educator and student data allow district leaders to see the best way to adjust a program more clearly. This ease shifts the focus from collecting data to analyzing it — a much more effective use of time.

In this small-scale study, the six districts looked at their professional learning programs, however established or nascent, to collaboratively examine methods of evaluating those programs. The most important takeaways for running an impactful professional learning program for educators: roll evaluation into the holistic design process, begin with targeted outcomes, take a systems approach, continually evaluate the program, and use a solid data system.