Lessons of Discovery: Evaluating Professional Learning Begins Before Step One

Professional Growth

How do we know if professional learning is paying off?

This is a question policy makers and education leaders ask frequently, especially when planning the upcoming year’s calendar or writing or updating school and district improvement plans. It is easy to plan professional learning, which is often viewed as a necessary step in upgrading educator practice, yet the proverbial question about its impact remains a mystery beyond the simple process evaluation.

In 2017, a group of school systems joined a partnership led by Frontline Research & Learning Institute and Learning Forward to explore how to answer the question of the impact of professional learning. Their journey to answer the question from multiple and diverse perspectives continues.

Some discovered that before evaluating professional learning, it is essential to determine if the plan or process for designing and implementing professional learning makes sense and has potential for producing results. Others found unique ways to analyze qualitative data to make decisions related to professional learning for a districtwide initiative. Still others used formative data to make adjustments in their professional learning program to increase the potential for results.

These districts discovered that creating a plan to assess the impact of professional learning must begin sooner than they anticipated. They found, first, that it is essential to know the overarching purpose for evaluation. Second, they discovered that most districts did not have a comprehensive professional learning plan to evaluate and first needed to consider how to develop one — and then do so. Only once they had a process for designing a professional learning program, and ultimately, a professional learning plan in place, were they ready to begin the actual evaluation process.

Purposes for Evaluation

It became clear to participating districts that they hadn’t considered the purpose for evaluation; they simply had a desire to evaluate. Evaluation, they found, can take many shapes. Not all evaluation types focus on measuring impact or the changes that occur as a result of the intervention.

Problem-focused

Leaders might engage in problem-focused evaluation, which uses data to uncover the need or define the problem to address with precision before planning a professional learning intervention. A problem-focused evaluation seeks to answer the question: Does the problem we are seeking to solve align with evidence of need?

Quality

Evaluation might also be used to assess the quality of a plan. One research-based set of criteria for effective professional learning is the Learning Forward Standards for Professional Learning. Numerous states and school systems have adopted the Standards, and others have adapted them. Other sets of standards exist as well.

To know if a plan for professional learning is likely to generate its promised results, clear and rigorous criteria are necessary. Evaluations of this type seek to answer the question: Does the plan meet criteria or standards for high-quality professional learning?

Implementation or Process

A third purpose for evaluation focuses on implementation or process. This form of evaluation uses data to account for the operation of the professional learning program. Evaluators gather evidence to know:

  • if the program is working as planned,
  • if there are unanticipated surprises, and
  • if those involved are meeting their expectations.

These data are then used to make adjustments in the plan to increase its likelihood of meeting the defined goals and outcomes. Some might call this formative evaluation; while it is a form of formative evaluation, it is focused not on outcomes but on operation. This type of evaluation seeks to answer the question: Is the program operating or being implemented as we planned or intended?

Effectiveness1

Another purpose for evaluation is focused on effectiveness1. Effectiveness1 measures the first level of impact: changes in participants. This form of evaluation uses data to measure progress toward or degree of change in the defined outcomes for participants, which can be defined as KASABs (knowledge, attitudes, skills, aspirations, and behaviors). It can be used as a formative evaluation when it examines progress toward the outcomes, or more as a summative evaluation when it measures the degree of change.

This type of evaluation seeks to answer the question: Are participants demonstrating anticipated changes and moving closer toward the desired outcomes?

Effectiveness2

Effectiveness2 is another purpose for evaluation and the second level of an impact measure. It focuses on the impact of the professional learning on the clients of participants.

If participants are administrators, their clients are most likely teachers and students. If participants are teachers, their clients are students. Professional learning program outcomes delineate the expected changes (KASABs) in clients of participants, and an effectiveness2 evaluation uses data to measure if those changes are occurring or have occurred. An effectiveness2 evaluation can be either formative or summative, and seeks to answer the question: Did the changes in client KASABs, as outlined in the professional learning program, occur?

There are other purposes of evaluation, such as planning, efficacy, and social justice and human rights. Another blog at a later date will address these purposes for evaluation.

While determining the purpose for an evaluation appears to be a relatively easy decision to make, it is also a place where politics, power, and priorities come into play. The purpose for the evaluation and the corresponding evaluation questions may conflict with one another or become overwhelming. The questions a director of professional learning wants to answer (e.g., Is the program working as planned?) may not align with the questions a board of trustees or directors wants answered (e.g., Is the return on the investment worth it?). Seeking clarity at this stage will eliminate headaches and confusion later in the process.

Step 1: Assess Evaluability

For professional learning to make a difference, it must be designed in a way that is likely to make a difference. And for professional learning to be evaluated for its effectiveness, it must be evaluable. Once the purpose for evaluation is determined, then evaluators assess the evaluability of a professional learning program.

The term evaluable means that a program of learning for educators meets the core criteria of being likely to change educator practice and results for educators’ clients, students. Evaluable programs have:

  • clearly articulated goals,
  • outcomes for participants and their clients,
  • standards and indicators for success of the outcomes,
  • a theory of change that explains the rationale for the program’s design and defined actions to achieve each outcome, and
  • a logic model to map early and late stage changes and necessary resources for the program’s operation.

Unfortunately, in some cases, professional learning fails to produce its intended results because it is poorly planned, includes insufficient actions over time to achieve the outcomes, or is poorly resourced. If the program’s plan includes these clearly identified components, then it is ready for evaluation and more likely to produce its intended effects.

Step 2: Formulate Evaluation Questions

Once the professional learning plan is deemed evaluable, evaluators use the goals and outcomes to formulate the questions for the evaluation to answer. The process of writing evaluation questions is easy if the goals and outcomes are precise and results-focused.

For example, if the outcome is:

By the end of Year 1, 90% of the teachers will plan and implement three units of instruction that integrate pedagogical strategies related to the eight mathematical practices to develop students’ competencies in reasoning mathematically, explaining their thinking, applying mathematics in authentic situations, and persisting in classroom tasks, and that meet established criteria for quality and completeness.

The evaluation question for this effectiveness1 evaluation then becomes:

At the end of Year 1, did 90% of the teachers plan and implement three units of instruction that integrate pedagogical strategies related to the eight mathematical practices to develop students’ competencies in reasoning mathematically, explaining their thinking, applying mathematics in authentic situations, and persisting in classroom tasks, and that meet established criteria for quality and completeness?

There is likely a parallel effectiveness2 question about student success within each unit using both formative and end-of-unit assessments.

Potential challenges to writing evaluation questions from a sound professional learning plan are the skillfulness of the evaluators and the intricacies of the design needed to answer the questions. Evaluators will want to know the degree of rigor their evaluation requires, or the permissible variance in validity and reliability, before they begin planning an evaluation framework.

If the evaluation is for a school leadership team to measure the effectiveness of a school improvement plan that includes professional learning, the evaluation may not need to be as stringent as one required as part of an externally funded initiative. If practitioners are conducting evaluation for their own internal purposes rather than for external audiences, they may approach their evaluation work through the lens of learning about their work. While they must commit to respecting human subjects and other ethical and operational standards for evaluation, their work can be more practical and learning-focused.

Designing the evaluation plan is the next task for evaluators to undertake, and it requires precise evaluation questions before the evaluation framework, or plan for how to conduct the evaluation, can be designed.

You may also be interested in:

eBook: “Professional Development Program Evaluation for the Win! How to Sleep Well at Night Knowing Your Professional Learning is Effective.”

Next Steps

Knowing whether professional learning generates its expected impact takes careful, deliberate planning before an evaluation can be conducted. The discoveries of the districts that participated in the Measuring the Impact of Professional Learning initiative offered useful insights into the challenges of evaluating professional learning and unique ways to address them, which often led to unexpected information and more opportunities for inquiry than anticipated.

As the districts tell their stories in this series of blogs, education leaders are invited to ask these questions to make discoveries that will be useful to them:

  • What is the purpose of the evaluation this district proposed?
  • What discoveries did the district uncover?
  • What questions was the district seeking to answer?
  • What challenges did the district encounter, and how did it address them?
  • What are you discovering that can inform your own work?

For further reading from Joellen Killion:

BOOK: Assessing Impact: Evaluating Professional Learning, 3rd edition

BLOG POST: Measuring the Impact of Professional Learning

Joellen Killion

Joellen Killion is a senior advisor to Learning Forward and served for many years as the association’s deputy executive director. As senior advisor, she leads initiatives related to the link between professional development and student learning. She led the most recent revision of the Standards for Professional Learning, and has extensive experience in planning, design, implementation, and evaluation of professional learning at the school, system, and state/provincial levels. She works with coaches, principals, district and state leaders to support understanding and embedding standards-based professional learning in a system. The author or co-author of numerous books, her most recent book, published in 2015, is The Feedback Process: Transforming Feedback for Professional Learning.