
Jason Messer: Fostering Effective Education Transformation


by Jason Messer
Superintendent, Manteca Unified School District

In 2013, the Manteca Unified School Board allocated 30 million dollars (as a foundation) and directed me, Superintendent Jason Messer, to take the District digital by the end of the 2014-2015 school year. In the past 18 months we have successfully identified the tools, support, and resources necessary to meet the School Board's expectations and to provide every teacher (1,200+) with a Surface Pro 3 and every student in grades K-12 (23,500+) with a purpose-built Windows 8.1 two-in-one tablet. We have also built out the wireless infrastructure necessary to support more than 25,000 computers accessing our network and the internet at the same time. In addition, each of our teachers has received more than 18 hours of staff development time and instructional resources, and classroom management software has been deployed as a key component. Please visit www.digitalschoolstoday.net and/or www.mantecausd.net to learn more about our story. To successfully take MUSD digital, I used the Microsoft Transformation Framework to guide our effort, and I strongly suggest that educational leaders review this body of research.

The aspect of going digital with which I have struggled most has been setting the vision and then creating our tracking and accountability program. Quite simply, I felt pressure not to set the bar too low or too high, and I remained vague in my responses to those who asked, hoping that with more time the vision would become clearer. Tom Clark's white paper on Quality Assurance was a "kick in the pants" for me, and we have now created a committee focused on outcomes-based planning and evaluation. The committee is expanding to include more stakeholders, including parents, students, and community members, and it has begun surveying multiple groups, including teachers, students, and parents. It will meet on a regular basis and continue to define and redefine our vision, successes, and failures.

While the pressure on me as Superintendent to define success and "set the bar" has not waned, I believe this response from Dr. Clark offers great perspective: "School leaders need to let everyone in the stakeholder community have their say, but in the end must make decisions based upon what they believe is good for students, the local community, and the public/society overall. Over the long term, the public/society will judge the decisions, through the political process via mechanisms such as school boards and tax levies." Following is the full written exchange between Dr. Tom Clark and me. It is a long and in some ways difficult conversation to read, but this is difficult work, and it should not be entered into lightly or without prior research and a commitment to success.

Jason Messer: (Q1). The role of the leader is to facilitate the creation of the vision, define the desired results, and plan the project or initiative. This process includes research into best practices and stakeholder expectations by both the leader and the stakeholders. The "Planning, Monitoring and Evaluation" cycle encircles this process in order to support the success of the initiative. What specific, detailed steps would you suggest, at what specific point in the planning process should a detailed monitoring and evaluation plan be developed, and why do you suggest that timeline?

Tom Clark: Thanks Jason for asking some excellent questions and helping us get the word out about the Microsoft in Education Transformation Network white paper series!  I should say at the outset that my white paper on Quality Assurance is primarily meant to provide guidance to education transformation leaders in state and national systems. One of the key points I make in this white paper is that these leaders should listen to what local education leaders say about how things really work — something that we as program evaluators hear about all the time — rather than designing inflexible goals and outcomes and methods for meeting them.

In response to your first question, I'd suggest integrating an outcomes-based planning mindset into the district leadership process itself. Then monitoring and evaluation is easier to do, and isn't just an "add-on" later on. Outcomes-based planning begins with defining your desired results – the main outcomes (changes in people or organizations) that you're seeking to achieve, and benchmarks along the way. It also impacts vision — if your vision has desired results that can't be clearly defined, that's a problem. Desired results may include interim outcomes along the way, such as results on local assessments that school leaders administer to predict student success on state tests later in the school year.

Operational outcomes related to building the educational program are also important interim outcomes to track. Using an outcomes-based planning approach will help ensure you are gathering the data you want to monitor over time, and will facilitate reviews of progress and tweaking of desired results through periodic evaluation activities later on. Monitoring of key metrics is continual, with reports typically on a monthly or quarterly basis. Evaluation and revision of goals and strategies is typically annual. When leaders don't clearly define their desired results, how they will achieve them, and how they will refine their approaches over time, they may encounter surprises along the way and be unable to demonstrate their success to stakeholders.

Outcomes-based evaluation can help you determine key metrics to track during the planning phase and document impact for accountability purposes. But developmental evaluation (Patton, 2010) can add value during the implementation phase, providing real-time feedback to inform ongoing innovation that goes beyond the periodic improvement feedback of traditional formative evaluation. Developmental evaluators (who may be internal) essentially function as a critical friend to the development team. They help develop new measures and monitoring processes as interim goals evolve during development, and document learnings as they arise.

Jason Messer: (Q2). You suggest that the first step in building an effective educational transformation program is defining desired results, and that this should precede planning the program and the monitoring and evaluation activities. While I can clearly understand how desired results should be part of the vision, how do you suggest we identify desired results with stakeholders prior to engaging them in the planning/vision development process?

Tom Clark: Good question, Jason. Stakeholder support is critical in any change process. I would suggest that the process of defining desired results should be integrated into outcomes-based planning. System planners need to involve school leaders and community representatives in planning an education transformation initiative, be prepared to modify the initiative based on stakeholder feedback, and give local leaders the flexibility to modify the approach to achieve desired results.

In the second white paper in the series, which you cite in a later question, Enabling Transformation with Strategic Planning, Ben Jensen describes a large-scale transformation initiative in Hong Kong. A 20-month process was undertaken with stakeholders to finalize the education change strategy. It was then carried out with stakeholder buy-in and a strong evaluation and accountability focus on achieving specific changes in teaching and learning practice. Some aspects, such as in-school support, were negotiated with local schools. Hong Kong has a centralized educational system that makes it similar to a very large school district. The initiative was very successful and probably helped account for Hong Kong's rising scores on international tests. Of course, decentralized systems may need to approach education change a bit differently.

In a school district, the leadership team may propose a vision for transformation, then test the waters with a variety of stakeholders.  It’s important to consider beforehand whether the vision has desired results that can be clearly defined, and to make those results part of the business case for the vision. A formal stakeholder involvement process and public relations effort are needed. The visioning process should include development of both high level vision and goals, and performance indicators and targets. These can then be shared with key stakeholders. Project RED, a research study on 1:1 programs, developed a planning chart which can be valuable to those developing related initiatives. 

Jason Messer: (Q3). You refer to the fact that there are several challenges to effective monitoring and evaluation, not the least of which is the "ongoing culture clash between the evaluation and political worlds, and the limits on evaluator independence within governmental systems," even as you cite "the ongoing role of evaluation as a trustworthy, dependable tool for preserving public accountability" (Chelimsky, 2008). When I juxtapose this statement against the statement put forth in the section entitled "Enabling Transformation with Strategic Planning, Organizational Capacity, and Sustainability," I question where our limited resources should be devoted to ensure true transformation in education. Do we focus our resources and efforts on short-term results demonstrated through politically backed assessments such as SBAC, or more on the shift in instructional practices carried out by our teachers? What do you think and why?

Tom Clark: State adoption of the Common Core standards followed quickly by the Common Core assessments is creating a lot of stress for schools, impacting teacher morale and leading to short-term achievement drops. NCLB annual testing mandates compress the adoption timeframe. One of the problems with system-wide initiatives is that a model like Common Core may be rolled out quickly without effective, sustained long-term professional development. Shifts in instructional practice take a great deal of time and effort to accomplish. They are essential to making a transition to new ways of teaching that focus on the Common Core. In a long-term evaluation, we might hope to see shifts in three years, not one year. I recently co-authored a related white paper on innovations in Common Core Math teaching and their relation to education reform efforts. 

Here I would reiterate that some organizations that adopt outcomes-based planning may design inflexible outcomes and an unrealistic theory of action for how those outcomes might be achieved. You referred to Ben Jensen's white paper, which notes that many initiatives focus on "policy lever" desired outcomes but fail to provide clear, realistic change strategies to achieve them. I totally agree that this is the case. I would argue for outcomes-based planning and evaluation that is participatory rather than top-down, and that includes the generation of clear change strategies that are realistic to implement at the local level.

Evaluators often find themselves inside the system, responsible in part for public accountability but only able to play an influencing role with decision makers. The clash is also between evaluators. Some evaluators would see the outcomes-based approach I advocate, or the behavioral change approach that Jensen advocates, as being in the positivist camp: reality is real and objective, and we can test and improve a program. These evaluators would instead take a constructivist approach, seeking primarily to understand how the program actually works on the ground, its unanticipated challenges and benefits, and the experiences of participants, including marginalized groups.

It is important to integrate elements of this constructivist approach into planning and evaluation. The most valuable way to do this in an innovative change program may be to have internal evaluators consciously take on a developmental evaluation mindset during implementation, going beyond the periodic feedback role typical in formative evaluation. The evaluation then becomes a valued part of the change process.

Jason Messer: (Q4). Part of the culture clash of evaluation that you note is the difference in the approaches of the methods-, use-, and value-focused evaluation theories outlined in Table 1.1. I focused on row three, "who primarily judges the program benefits." A key point you make is that an evaluation of holistic school reform requires elements of all three approaches listed in the table. If we believe it is key to address the needs of those who "primarily judge" the success of the reform effort, and if ultimately we believe that in public education the judges should be the "public/society," how would we evaluate in a manner that gives the community the strongest voice without alienating the other two groups of judges?

Tom Clark: It is important to involve stakeholders throughout the process. However, there will always be stakeholders who are advocates for certain issues or positions regardless of how things are going. In terms of alienating the other types of judges — experienced evaluators look at multiple sources of evidence about the same thing, and try not to be unduly influenced by individual points of view, unless in the aggregate, those views suggest a pattern or theme that needs attention. School leaders need to let everyone in the stakeholder community have their say, but in the end must make decisions based upon what they believe is good for students, the local community, and the public/society overall. Over the long term, the  public/society will judge the decisions, through the political process via mechanisms such as school boards and tax levies. 

Jason Messer: (Q5). Since I started with a question about when and how we engage in planning for monitoring and evaluation in the transformation process, I wish to come full circle and acknowledge that the key indicators of conditions that foster educational transformation are very enlightening as to the specific data we may want to begin collecting during the monitoring stage of our digital initiative. Has there been any work to establish valid and reliable survey questions or observational [methods]?

Tom Clark: Just as monitoring and evaluation won’t be an “add-on” if an outcomes-based planning approach is used, technology integration won’t be an “add-on” to teaching and learning if true integration is achieved. There’s been a lot of recent research on Technological Pedagogical Content Knowledge (TPACK),  proposed by Mishra and Koehler (2006) as a way to truly integrate technology with pedagogy and learning content. For example, Archambault and Crippen (2009) created TPACK tools to use with online teachers. 

The specific initiatives that I cited have led to indicators of education transformation, but have not yet led to specific monitoring and evaluation tools that are publicly available. However, the New Pedagogies for Deep Learning project (Fullan and Langworthy, 2013)  is using two tools freely available from the Innovative Teaching and Learning (ITL) Research Project sponsored by Microsoft Partners in Learning. The tools include a Learning Activity Rubric and a Student Work Rubric, which are being used to assess deep learning by students in about 1000 project schools worldwide.

There are many other survey and observation tools out there. The Institute for the Integration of Technology into Teaching and Learning at the University of North Texas developed and validated several tools in the early 2000s that are freely available. The Florida Center for Instructional Technology offers online surveys on an annual license basis and also has a lesson review tool. Hofer, Grandgenett, Harris, & Swan (2011) sought to validate a TPACK-related observation instrument; they make it and a lesson review tool freely available.

We now know some of the key indicators of conditions that foster educational transformation. I hope that these ongoing efforts lead to methods and tools that help leaders effectively plan, monitor, and evaluate education transformation initiatives.

I want to thank all of the authors of the Microsoft Transformation Framework for the research and insight that I found so helpful in ensuring success in the Manteca Unified School District, and I want to specifically thank Dr. Tom Clark for his time and thoughtful responses to my questions.

 

 
