Moving the Needle:
Leveraging Legitimate EE Flexibility to Grow Teacher Practice and Student Learning

by Dr. Joe Schroeder, Associate Executive Director, AWSA

Educator Effectiveness (EE) has been a key component of Wisconsin’s education landscape for over five years.  Policies, platforms, and processes have adapted over that time, primarily in response to feedback directly from Wisconsin educators who identified ways to make the process more meaningful, authentic, and impactful.  But throughout the years, two constants have remained:  (1) the founding EE purpose to improve practice and support for adults so that student outcomes improve and (2) the abiding reality that EE implementation requires a great deal of work.  The purpose of this article is to update you on current EE themes we are learning from the field, reinforce legitimate DPI flexibilities at your disposal to increase local EE impact, and highlight fruitful practices by a number of Wisconsin leaders for consideration by districts across the state so that we can increasingly “move the needle” for student learning via EE.

Statewide evaluations of the WI EE System indicate that implementation as intended (i.e., a continuous improvement cycle embedded within daily practice) positively impacts a variety of outcomes (e.g., school climate and culture, educator satisfaction and retention, and changes in educator practice to improve student outcomes).  However, these evaluations also find that if the EE System is implemented without a focus on growth or best practice, the opposite is true.  Specifically, school climate and culture suffer, educators are unhappy and leave the profession, and practices and outcomes do not change.  And either way, the process is time-consuming and difficult.  In recent years, I have written and spoken on numerous occasions to encourage Wisconsin leaders to leverage flexibilities within Wisconsin’s EE system so that leaders can positively impact adult practice and student learning, thereby ensuring significant impact from all the EE effort.  This article continues and updates this ongoing effort.

EE Implementation Themes

This past December, in order to further assess the current state of EE in the field and the state policy that supports it, AWSA convened a gathering of Wisconsin education leaders.  This December 2018 EE Thought Leaders group brought together twenty participants, representing school- and district-level leaders from regions across Wisconsin.  After establishing a common background on the origins of EE, confirming flexibilities from previous DPI communications, and sharing examples of actual flexibility implementation in the field, we asked participants to identify major EE themes based upon their current experiences and observations: 

Notable Themes of EE Implementation from Dec. 2018 EE Thought Leaders Group

  1. EE has focused local school professional development efforts on observable instructional strategies
  2. Schools/districts are seeing higher EE impact on practice where frequent, unannounced mini-observations are used more regularly than traditional formal observations
  3. EE practices remain traditional in many/most systems
  4. Educator mindset while implementing EE often defaults to compliance
  5. Districts that connect evaluation to compensation face inherent dangers in coupling the two
  6. Information needs to be better communicated/understood regarding EE requirements and flexibility for:
    1. Artifact collection
    2. Observation time / approach
    3. Document tool flexibility

Clarifying Legitimate Flexibilities for Implementing EE with More Impact

Since that meeting, I have been in ongoing communication with Katie Rainey, who serves as DPI’s Director of Educator Development and Support.  Below, Katie speaks to the minimum requirements of the Wisconsin Educator Effectiveness System that all districts, regardless of model, must meet and that DPI will monitor to ensure compliance with state statute (i.e., implementation of EE). Katie noted that implementation of minimum requirements alone will not produce changes in adult practices or student outcomes.  For a description of best practice, within the requirements of the System, as defined by DPI, refer to the state User Guides.  Providers of alternative EE models may have additional requirements for their users.  Because more than a third of WI districts use the CESA 6 EP model, Katie and I met with Cathy Clarksen (Director of the CESA 6 Center for Professional Practice and Growth) to ensure that this article is also accurate for the CESA 6 model.  What follows are four key questions raised by the EE Thought Leaders group (in red italics), Katie Rainey’s responses in turn (in blue font), and occasional comments about CESA 6 model applications where relevant, with questions 1 and 2 being most prominent.

1)    What are districts required to provide / do regarding EE?

The following is an excerpt from an upcoming Policy Guide that DPI is developing.  The Guide will identify the minimum requirements, as well as how DPI will monitor for those requirements.  For an understanding of what DPI intends for EE System implementation, including the five principles of a learning-centered evaluation process, please refer to the state User Guide.

Wisconsin Educator Effectiveness System Implementation Requirements:
  1. Districts provide orientation and training for educators and evaluators.
  2. Evaluators certify and districts provide ongoing monitoring of inter-rater agreement (calibration).
  3. Educators complete at least one Student/School Learning Objective (SLO) and one Professional Practice Goal (PPG) annually, as part of an annual Educator Effectiveness Plan (EEP) to improve performance. 
  4. Evaluators conduct required EEP conferences, including: Planning, Mid-Year, and End-of-Cycle conferences. 
  5. Evaluators conduct required observations of professional practice in the Summary Year and Supporting Years of the EE Cycle. 
Please see the Appendix at the end of this article for more detail regarding each of these five EE requirements.

2) What flexibility do districts have in the following areas of EE:

A) Artifact collection / component rating?
B) Observation approach?
C) Documentation tool?
D) SLO / PPG determination (whether connected or not to school- /district-wide goals)?
 

A. Artifact Collection:  Artifacts are simply a way to collect/document evidence.  There is no requirement from the state regarding the number or type of artifacts.  The requirement is regarding evidence—there must be evidence to provide a comprehensive view of educators’ practice and to support meaningful conversations about practice.  Educators must be involved in determining if the evidence provides a comprehensive view of their practice.  However, equivalent models may have additional requirements.  For example, the CESA 6 EP model does include a required documentation log for teacher-selected artifacts and reflection.  The number of required artifacts is determined locally within CESA 6 districts.  In all cases, the focus should be on quality.  DPI has consistently stated that the process for collecting or documenting the evidence should not get in the way of meaningful conversations—use a process that works for you locally. DPI would also recommend that educators utilize “high-leverage evidence sets” which address most components with limited numbers of artifacts (e.g., for a teacher:  a lesson plan, an observation of that lesson, data from the formative assessments used during/after the lesson, and a discussion regarding how that data will inform changes to instructional practice).

Component Rating:  The state System no longer requires evaluators to “score” practice.  In the past, evaluators had to collect evidence and then, based on the preponderance of evidence collected, determine which level or rating best described levels of practice.  However, the state heard feedback from the field and agreed this requirement did not support best practice.  Therefore, DPI removed this requirement and provided the following guidance instead.  (Note: The CESA 6 EP model does require the evaluator to use the preponderance of evidence from multiple evidence sources to determine a rating for each of the six standards during the summative year.  As such, the following description does not apply to districts using the EP model.)

Oftentimes, practice for specific indicators or look-fors within any one component varies across levels of performance.  Asking evaluators to pick the level which “best describes” practice meant that evaluators were not actually providing specific feedback based on observed practice.  DPI recommends that evaluators identify the level of practice for the indicators within the component and provide feedback at this indicator level.  This level of detail enables educators to create a clear plan that identifies where they are on a discrete skill within a component, where they want to go, and how they will get there, leveraging their strengths in the process.  This process supports richer discussions and is more likely to move practice than providing an “overall” score or level of practice that does not fully and accurately describe the educator’s practice.  

DPI understands that it is not practical to have this level of discussion across the entirety of any given rubric.  Thus, DPI recommends educators identify high-leverage areas for growth to focus coaching conversations and learning for the year.  With that being said, there must be an accurate/effective way of determining areas of focus.  Therefore, evaluators are required to collect evidence of educators’ practice across the entirety of the rubric to determine which areas to focus on.  By reviewing this evidence, an evaluator can effectively determine areas to focus coaching on for the year.  (Note: It is a requirement of all models to collect evidence across the entirety of their rubric/framework. See #5 in the Appendix at the end of this article.)

SLOs:  Districts can apply this same process to conversations about the SLO.  Specifically, educators can identify levels of practice across the six components of the state SLO rubric and engage in reflective conversations that support growth in each component, rather than an overall, holistic score. (Note: The CESA 6 EP model currently uses a slightly different SLO rubric that does not break out the six components.  Leaders of this model are considering moving to this type of “broken out” SLO rubric in the future.  In the meantime, CESA 6 recommends EP evaluators provide a holistic SLO rating.)

B.    Observation Approach:  DPI requires a minimum amount of time educators must be observed.  There is flexibility in the format and length of each individual observation.  For example, in a given year when approximately 90 minutes of total observation time (minimum) is required, a principal may use 5-6 mini-observations (15 minutes each) in lieu of a traditional, full-length, announced observation (45 minutes), or some combination thereof that meets the requirements.  Please refer to Appendix D, beginning on page 57, in the Teacher User Guide, or Appendix E, beginning on page 77, in the Principal User Guide for more information on the requirements regarding the number/type of observations.
 
C.    Documentation Tools:  Districts must use the provided (or approved alternative) rubrics/frameworks. But, DPI does not require districts to use any specific online tool or forms to document collection of evidence for those rubrics, or to document the EE process. DPI tools (i.e., Frontline) and forms (e.g., Beginning, Mid-Year, and End-of-Year forms) were created and are provided to support conversations, as well as the maintenance and organization of evidence to inform said conversations.  The focus should be on high-quality coaching conversations.  If the documentation tool or forms a district is using are negatively impacting best practice or quality conversations, DPI would recommend using a process that better meets the district’s needs.  (Note: EP districts must use Frontline to document their EE process and evidence.)
 
D.    Alignment of EEP Goals to Schoolwide Goals:  DPI would recommend (but does not require) that SLOs align to district/school goals in order to support systematic improvement efforts.  However, DPI would recommend the educator have flexibility to make this alignment in a meaningful and authentic way for his/her context.  DPI would not recommend dictating specific goals for all educators. 

 

Sharing Fruitful EE Practices for Your Consideration

With knowledge of the (often underutilized) flexibility described above, and under the maxim that example is often the best teacher, we also leveraged our December Thought Leaders group to begin archiving flexible EE approaches that Wisconsin schools and districts have been implementing.  This archive informs leaders across our state of a range of legitimate actions they could take within EE implementation flexibility to raise prospects for “moving the needle” in adult practice and student success.  Please know that each of these listed approaches has been reviewed by DPI Director Katie Rainey and deemed in line with Wisconsin EE policy.*  Contact information for each approach is also provided so that you are better positioned to access the power of a statewide PLN. 

“Everything in the archive is allowable and an example of doing the work as intended, assuming schools/districts are meeting the other requirements of the system.  Most of these archive examples focus on one aspect of change.  They are all allowable if that change is within the larger system (meaning they are addressing all five requirements, as listed on p. 2 of this article).”  -Katie Rainey, Director, Educator Development and Support, Wisconsin DPI

 

3)  Where should principals go when they are seeking EE updates?

  • DPI would recommend that principals utilize the DPI User Guides.  Although they are lengthy, they are also the most comprehensive resource providing an updated narrative of DPI’s vision for Wisconsin’s EE System.  Additionally, CESA 6 EP school administrators and educators may utilize the appropriate EP guidebooks at epsupport.cesa6.org.  DPI also has a Latest News section, the content of which is also available via a monthly newsletter.
  • We would also recommend principals consider taking a team to WOW (which will be renamed next year to the Leading for Learning series).  These events are designed to provide learning opportunities which focus on implementing EE meaningfully as part of best practice for leadership and instruction.  In addition, we provide updates regarding the EE System and address misconceptions or myths.  Information regarding next year’s series (including speakers, content, and registration) will be live mid-April.
  • I would also recommend attending an EE Exchange through your CESA or as a pre-conference session at the Leading for Learning Summit this June.  The EE Exchange is designed to provide district, school, and teacher leaders with unique insights and a rich opportunity for planning and growth in their Educator Effectiveness (EE) implementation.  Districts come together to review reports based on their local Wisconsin Educator Development, Support, and Retention (WEDSR) Survey data.  District teams learn about how to interpret the data from the team of researchers from Socially Responsible Evaluation in Education (SREed) at UW-Milwaukee, who developed the survey and reports.  District teams also learn about what climate and culture factors influence effective EE implementation and vice versa.
  • We will soon have a new report detailing findings from the Learning-Centered Study.  This study identified schools/districts that were implementing EE with a learning-centered evaluation approach aligned to the five principles.  Through observations, interviews, and focus groups within these locations, the report identifies specific examples of ways Wisconsin districts and schools have implemented EE meaningfully and have seen impact.  This study will inform the development of future DPI supports, but can also serve to inform local practice.  The study will be disseminated through the latest news and newsletter, noted above, in the coming months.
  • You can always email or call me at [email protected] or 608.267.9551.  We designed EE to be meaningful, authentic, and impactful.  If you feel, at any point, that something about EE is in opposition to that vision or best practice, let us know.  It means there is a misunderstanding regarding requirements we need to work to clarify, or it means there is something we haven’t yet considered. Either way, we need to know so we can address it.
  • Finally, we are also working closely with our partners (WASDA, AWSA, WCASS, and CESAs) to ensure there is a consistent understanding and messaging regarding EE (as well as with equity, mental health and SEL, and the WISEsuite) so that you receive the latest and most comprehensive information from the sources you already utilize in the most meaningful way possible. 

4)  What suggestions might you have for leaders who strive to be both effective and efficient with EE?

I would recommend a focus on utilizing the process as an ongoing continuous improvement cycle.  If educators are implementing high-quality professional learning community (PLC), plan-do-study-act (PDSA), and/or teaming processes in which they regularly review student data and evidence of their own practice in order to inform instructional shifts in collaboration with peers, coaches, or evaluators, they are meeting our goals for EE.

Ideally, an educator reviews historical data to identify problematic trends.  If trends exist across years, that is an indication of an issue with practice, not with student achievement.  To better understand a potential cause, educators should review data of their own practice, asking for example, “Are there practices I struggled to implement at high levels which may impact this particular student outcome?”  With this understanding, educators should identify a goal to address that problematic trend and include a focus on the leadership or instructional practices they will change.  This should not end the process.  The continuous improvement process should continue on a weekly (minimally) basis.  For example, as a teacher, I will plan instruction based on my goal.  During my instruction, I will formatively assess my students’ learning.  Using data from that assessment, coupled with support from peers and/or feedback from an observation of my practice, I will shift my instructional strategies.  This is a structured continuous improvement / PLC / PDSA process that incorporates evidence, reflection, and coaching on adult practices in addition to student data in a way that fits within authentic daily practice.  If done well, this meets requirements for EE.

This process is still going to be time-consuming, but it focuses the time on authentic daily instructional/leadership practices with regular feedback built in through the use of observations and coaching by peers, teams, coaches, or evaluators.  This is time well-spent that will impact adult practices and student outcomes.  I would NOT spend time on something that feels like compliance or that gets in the way of this meaningful work.  In the “archiving flexible EE approaches” resource that Joe Schroeder offers in the text box earlier in this article, you see several examples of schools and districts doing exactly this very well.

In Closing and Next Steps

We wish to extend a hearty thanks to Katie Rainey of DPI for her thoughtful replies and counsel in this important ongoing work -- and also to Cathy Clarksen of CESA 6 for her input into the accuracy of this publication for CESA 6 EP model districts as well.  Please know that we will continue to share this information broadly in future webinars, academies, and conferences.  For example, we will be highlighting this information and examples at the WASDA Spring Conference in April and at the Leading for Learning Summit in June.  Moreover, we will be hosting a webinar on this topic on Tuesday, April 23 at 1 PM (with Katie Rainey and Cathy Clarksen joining me as guests to answer questions raised in the live discussion/Q&A segments of the broadcast).  Please take the two-minute poll associated with this article, the results of which will help to enhance all of the related efforts just mentioned. 

Your AWSA team hopes that our ongoing efforts like those described above help more and more Wisconsin leaders to increasingly leverage both legitimate EE policy flexibility and innovation in the field to improve support, practice, and outcomes wherever they are rooted.  Please feel free to contact me ([email protected]) if you have further questions/concerns about EE implementation and/or if you have another flexible EE approach to share for potential inclusion within our table archive.  Wishing you all the best and most productive spring yet! 

_______________________________________

 

APPENDIX

Wisconsin Educator Effectiveness System Implementation Requirements
#1 Districts provide orientation and training for educators and evaluators.  Districts must provide educators and evaluators with a comprehensive understanding of the Wisconsin EE System, as well as the district’s adopted EE model (e.g., DPI, CESA 6, equivalent).

Districts or schools must provide an annual orientation to the system for educators who are new to the district or completing a Summary Year.  As described in the EE System User Guides, orientation provides educators and evaluators a space to discuss a high-level overview of the state system and the district’s selected model, including “the evaluation criteria, the evaluation process, or the ongoing continuous improvement cycles informed by evidence of educator practice collected during observations, the use of evaluation results, and any remaining questions or concerns.” 

Educators and evaluators should also engage in EE System training that deepens their understanding of the System and improves staff capacity on an ongoing basis. EE System training should generate consistency in the use of the model.  Districts and schools may draw upon DPI guidance and training resources, along with other online and Cooperative Educational Service Agency (CESA)-provided professional development opportunities, when creating local training. 

#2 Evaluators must certify and districts must provide ongoing monitoring of inter-rater agreement (calibration).  Districts must create and implement a process (beyond initial orientation and system training) to ensure, and continuously improve, inter-rater agreement of all evaluators. This is necessary to ensure educators receive accurate feedback—evaluators must be able to accurately identify educators’ current level of practice, in order to coach them on where they can improve and how, specifically, to get there.

In the DPI model, evaluators of teachers must initially certify using a rigorous computer exam after completing comprehensive certification training that uses master-scored videos of classroom practice. Evaluators must calibrate at least once every semester (except semesters in which the evaluator initially certifies or recertifies). Evaluators must use the same online system to recertify every four years. 

#3 Educators complete at least one Student/School Learning Objective (SLO) and one Professional Practice Goal (PPG) annually, as part of an annual Educator Effectiveness Plan (EEP) to improve performance.  Educators develop an EEP annually and submit all EEPs from their current evaluation cycle to their evaluator in their Summary Year.  Educators base EEP goals on data and write them as specific, measurable, attainable, results-based, and time-bound (SMART) goals.  EEP goals help educators engage in a continuous process of analysis of student/school data and self-assessment of practice. 

#4 Evaluators conduct required EEP conferences, including: Planning, Mid-Year, and End-of-Cycle conferences.  Educators and their evaluators or peers meet to review EEP data, adjust instructional/leadership strategies as appropriate, and reflect on progress through required system conferences:
  • Planning Session:  Educators and evaluators (in Summary Years, minimally) or peers (in Supporting Years, minimally) meet to review proposed EEP goals in preparation for implementation.
  • Mid-Year Review:  Educators meet with evaluators or peers to review EEP progress and adjust strategies and goals as appropriate.
  • End-of-Cycle Conference:  Educators meet with evaluators or peers to assess the degree to which EEP goals were met and plan for the next EE Cycle. 
#5 Evaluators conduct required observations of professional practice in the Summary Year and Supporting Years of the EE Cycle.  Evaluators must conduct observations in a manner that provides sufficient evidence to conduct professional conversations and to assess the educator in all observable domains and related components of the professional practice framework.  For requirements regarding the number/type of observations and the flexibility available, please refer to Appendix D, beginning on page 57, in the Teacher User Guide, or Appendix E, beginning on page 77, in the Principal User Guide.

 
