On November 22, the Massachusetts Department of Elementary and Secondary Education (DESE) will release data on the first year of the State's new Educator Evaluation system. The data will include aggregated schoolwide and district results, not individual educator ratings. As a Race to the Top district, Cambridge Public Schools implemented this new system for the first time during the 2012/13 school year.
The new system focuses on continuous learning by teachers and administrators through a five-step evaluation cycle that includes self-assessment and goal-setting; by State law, it will soon begin to incorporate measures of student learning for both teachers and administrators. We are now working with the CEA to establish District-Determined Measures to ensure that MCAS results are not the sole indicator of an educator's effectiveness. In time, again per State law, student and staff feedback on educator performance will be used as well. Principals and other administrators are also being evaluated under the new system.
Because the new system is complex (for example, all educators are evaluated on four standards and at least 32 indicators), CPS invested significant time in training our evaluators on the new standards and in the use of new technologies that support communication between administrators and teachers. Information sessions were conducted for teachers during the opening-of-school professional development days in August 2012, followed by in-school workshops led by principals and department heads.
As part of the roll-out of the new system, the DESE mandated that Cambridge and other districts evaluate 50% of all educators in 2012/13; CPS exceeded this goal, evaluating 65% of all educators in the first year of implementation. Given the complexity and time-intensive nature of the new system, we are not surprised to see more variability among schools in the evaluation results than we expect to see in the coming years. While some might be tempted to read these variations as suggestive of relative school quality, I urge you instead to view the report as evidence of a work in progress.
Indeed, this is what all districts appear to have experienced. We are studying the data to understand the factors underlying differences in results by rating category across schools and to learn from them. It is important to remember that these results represent our initial effort in a new evaluation process, one that will require better calibration and coherence over time. Just as an orchestra needs many rehearsals before it can perform a symphony fully in sync, educators across the state require time to develop a shared understanding of how best to use the new system.
The new evaluation system holds promise for our district as we work with our educators to promote improved achievement for all of our students. We will continue to support our administrators as they develop and hone their evaluation skills with this new system, and we will join with our teachers to build a professional environment that holds high expectations for adults and children alike.
Jeffrey M. Young
Superintendent of Schools