Leaders of Performance: Planning with the End in Mind

[Note to readers: This column is part of an ongoing series for Iowa ASCD’s The Source e-newsletter.]

What does it mean to be a curriculum lead? This is the sixth column in a series for Iowa administrators, teacher leaders, and anyone else interested in enhancing curriculum leadership. Over the past year or so, we’ve discussed the work of curriculum, instruction, and assessment; data analysis; processes; professional development; and relationship building. This week, we’ll take a closer look at what it means to be a leader of performance. Future columns will consider the remaining two facets of curriculum leadership: operations and change.

According to the functions of our work, curriculum leads “…model, expect, monitor, and evaluate continuous learning of all students and staff members.” Monitoring and evaluation matter! One way of thinking about this is that curriculum leaders ought to consider the desired outcomes before taking on a new program or initiative. Any educator who has been around a while knows our profession is pretty good at trying new things and/or renaming old ones. First, it was “instructional decision making,” then it was renamed “response to intervention,” and now we call it “multi-tiered system of supports.” Last year, our district focus may have been project-based learning, and this year it is creating profiles of a graduate. Twenty years ago, we were writing standards and benchmarks from scratch (or, let’s be honest, borrowing them from the district next door!), and now we’re digging into the latest iteration of the state’s content standards. I don’t mean to sound cynical in making these observations, but rather to illustrate how much time we allocate towards improving schools. We can and should be working towards a culture of continuous improvement. This sometimes means dropping old ideas that do not work and/or trying out new ones. As curriculum leaders, our role is to model and evaluate these changes. As such, the purpose of this column is to suggest several practical ways leaders can evaluate and monitor changes rather than starting and stopping them without attention to fidelity of implementation.

One way to think about monitoring and evaluation is to make connections with quality curriculum, instruction, and assessment practices at the classroom level. In the Understanding by Design framework, McTighe and Wiggins suggest “effective curriculum is planned backward from long-term, desired results through a three-stage design process (Desired Results, Evidence, and Learning Plan).” In other words, classroom teachers should consider where they want students to be at the end of a unit, course, or reporting period and plan backwards. The same concept can and should apply to any major building- or district-wide curriculum change. Curriculum leaders in tune with the performance function should first consider what success would look like at the end of the school year. Unfortunately, I was guilty on more than one occasion of launching a new idea without being able to articulate what “success” would look like as the year progressed.

Instructional coaches and others in curriculum leadership roles are in an excellent position to ask each other questions such as:

  • “If we implemented this change with fidelity, what would our teachers be doing differently in May when compared to September?” and
  • “If we excelled at this change, what would our students be doing differently in May when compared to a year ago?” 

Too often, we’re guilty of providing a wonderful splash event kicking off the school year, one that encourages all staff to think more deeply about grading practices, social-emotional learning, or trauma-sensitive schools. We might even follow it up by seeking feedback from teachers on their perceptions of the August workshop and using this information to plan a follow-up in October. An alternative approach might be to cast a tangible vision of what staff and students would be doing differently six months later, and then regularly provide staff with feedback along a continuum as they work towards this ideal state.

One such tool for monitoring and evaluating an educational change is the innovation configuration (IC) map. In case this is a new concept to you, “An IC Map specifies behaviors and expectations related to implementing a curriculum, intervention, or evidence-based practice and categorizes these behaviors on a spectrum from ideal to less than ideal” (REL). An excerpt from one of Central Rivers AEA’s innovation configuration maps on number talks is included below.

Note how a teacher could self-assess to determine the extent to which he/she has an environment conducive to number talks. Additional components of the unabridged innovation configuration map for number talks include teacher role in student discourse; teacher questioning; teacher notation; academic language; and instructional time, to name a few. Through the use of this tool, a teacher sees that he/she should be working towards a new type of seating, using student hand signals, implementing specific questioning techniques, and utilizing math problems with increased rigor to make connections with previous learning. Although the August workshop on number talks may provide an overview of what number talks are and are not, an innovation configuration map can give all educators a description of the desired state. Similarly, curriculum leaders may ask teachers to self-assess their current progress along the IC map. This information could be used to plan next steps in professional learning and to celebrate interim progress. At the end of one or more years (assuming countless hours of professional learning, coaching, and support, of course!), elementary teachers may be expected to be fully implementing number talks in their classrooms. The innovation configuration map provides a visual of what this change (number talks) looks like in the end, when implemented with fidelity.

Our role as curriculum leaders is to monitor and evaluate the changes initiated by the department of education or local administration. We owe it to our colleagues to show them what success looks like early and often. An innovation configuration map is one possible tool to assist in this quest towards comprehensively monitoring and evaluating professional development. Dr. Tom Guskey’s five-level professional development evaluation framework suggests educators’ use of new knowledge and skills takes time and, as such, requires ongoing evaluation:

“…Did the new knowledge and skills that participants learned make a difference in their professional practice? The key to gathering relevant information at this level rests in specifying clear indicators of both the degree and the quality of implementation. Unlike Levels 1 and 2, this information cannot be gathered at the end of a professional development session. Enough time must pass to allow participants to adapt the new ideas and practices to their settings. Because implementation is often a gradual and uneven process, you may also need to measure progress at several time intervals.”

(Guskey, 2002)

Implementing a new instructional practice takes time. I suggest that curriculum leaders move away from “spray and pray” professional learning, in which we hope scattering snippets of training and support will somehow find their way into all classrooms. If blended learning is the professional learning focus of the year (or the next few years), curriculum leaders should begin with the end in mind, share the success criteria, and provide regular support to help everyone achieve the intended results. If collaborative teams within a professional learning community philosophy are the improvement focus in 2019-20, all teachers should know what an effective teaming environment looks like and does not look like, as well as where they are along the implementation continuum.

In closing, curriculum leaders who value performance know that monitoring and evaluation matter.

Resources for further learning as a leader of performance:

  • Understanding by Design Guide Set by Wiggins and McTighe (2014, ASCD)
  • Evaluating Professional Development by Guskey (1999, Corwin)
