By Madelynn Coldiron
Staff writer
Three themes crop up when you talk to districts that have been field testing the state’s new educator evaluation system: time, money and change. But underlying those is a belief that the pluses outweigh the minuses.
“But if you look at the big picture of where we will be going with our Professional Growth and Effectiveness System, it’s got some real value in it,” said Corbin Independent Schools Superintendent Ed McNeel. “You’re measuring performance of all teachers and getting a value in areas that lead you to discussions, not necessarily evaluating you to have a negative impact, but how you can become better.”
PHOTO: Corbin Middle School Principal Ramona Davis, far left, observes teacher Jennifer Parsons as she leads a lesson. Photo by Susie Hart/Corbin Independent Schools
About 50 districts have been piloting the system on a very small scale, some of them, like Corbin, for two years. In the 2013-14 school year, every district will be field testing it in at least one-tenth of its schools – a minimum of one school – with five to nine teachers per school. It goes statewide, counting toward accountability scores, in 2014-15.
Some important details remain to be decided, such as how heavily student growth will be weighted in an educator's performance rating and how often educators will be evaluated, but the framework is getting some good reviews.
Teacher reaction
“Our faculty really embraced the field testing because they don’t fear change,” said Augusta Independent Schools Superintendent Lisa McCane. “We knew it was coming down the road and we felt like it would be advantageous for us to go ahead and get in early.”
McCane said the district also wanted to be able to provide the state education department with feedback to help shape the PGES. The district evaluated eight teachers at all grade levels using the new system and the superintendent said she hasn’t found any differences in the way it works among those levels.
In Lawrence County Schools, a two-year pilot site, Superintendent Mike Armstrong called the system “a really good fit” for where the district is headed. He said teachers in the pilot have realized they “have to have command of all the expectations,” and make sure they document how they are meeting the standards for effective teaching.
Corbin Middle School Principal Ramona Davis, who recently was promoted to assistant superintendent, said teachers are apprehensive about the weighting of student growth over the course of a year in the new system. There also is some concern about using student surveys, she said.
Davis said this past year was a “learning experience” with a good outcome as she piloted the system with two teachers.
“I saw a vast improvement in my teachers … because now we’re getting into good conversation about teaching and learning,” she said.
One of those teachers, Jennifer Parsons, said the PGES has a lot of positives and gives teachers more of a voice in the process.
“It gives you a way to open up a conversation with your administrators when you’re reflecting on your teaching practices throughout the year, not just on one observation but several times throughout the year,” she said. “That’s a big advantage.”
Concerns
Such a major change calls for more professional development, but the state spigot for that has been dry. “It really would be nice to have a lot more training for our staff and we don’t want to ask them to do several days of training and there’s no funding to do that with,” McNeel said.
McCane said her district is using a “train the trainers” approach as it usually does to keep the budget down, relying on in-house expertise.
The time it takes for principals to become certified to conduct the evaluations also is a hurdle – the rigorous, online training modules eat up many hours and they must pass a two-part test to become certified. The evaluation process itself takes more time, said Lara Hill, principal of Caldwell County Primary School, who fully piloted the system with one teacher this year.
“I’m looking at doing this process with 16 teachers (per year), potentially. That scares me,” she said.
At George Rogers Clark, one of the state’s larger high schools, which did not participate in the pilot this year, Principal David Bolen said he evaluates an average of 60 teachers per year at his Clark County school.
Daviess County is using an all-hands-on-deck approach to help principals. Assistant principals and district administrators are getting certified to conduct evaluations, while staff developers and special education consultants are training to become peer observers, said Julie T. Clark, assistant superintendent for teaching and learning.
“Our emphasis in implementing this new evaluation system is to build capacity among both our school and administrative teams,” she said.
The PGES is being implemented at a time of great change for schools, with a new accountability system, new Common Core State Standards and more. That has been the most difficult aspect for educators, said Lawrence County’s Armstrong.
“When I try to listen to folks and the feedback I get, it’s not about people’s refusal to take things on. It’s just a matter of the climate right now, the culture, the money. It’s a hard time,” he said.
Hill said parts of the system, such as goal-setting, are “really good,” but added, “It’s not that the things that are being brought to us are not good – they are good things – but they’re brought to us so quickly we can’t swallow one before we’re swallowing another.”
Getting ready
McNeel said schools should waste no time getting ready for the new system. “All districts need to be aware this is a huge undertaking as we unfold all this and planning needs to be taking place a.s.a.p. to implement it,” he said. It’s also important to talk about the process with all teachers, he said, not just those who are part of the field testing.
Clark County’s Bolen said he and his assistant, both certified evaluators, have already identified 10 teachers to follow next year, choosing a mix of new and experienced teachers, as well as some who are skeptical in the hopes that they will eventually become advocates for the system.
“We’ve been meeting with those 10 teachers, doing some legwork,” he said. “I think that will help pave the way for it.”
Corbin’s Davis said the district is working on a plan for training the entire staff throughout the year next year. “I think you can overwhelm your teachers if you try to do it all in one day or even two days,” she said. “I think it needs to be broken down throughout the year, going over each piece, so there are no surprises.”
Tweaks possible, other final decisions coming
The statewide field testing of the new teacher Professional Growth and Effectiveness System in the 2013-14 school year likely will look about the same as the process that was tested in 50-some districts in the 2012-13 school year.
Felicia Cumings Smith, associate commissioner of the state education department’s Office of Next Generation Learners, said the overarching components will stay the same. The state education department is looking at data gathered from the pilot districts this year and will likely decide by the middle of June whether any tweaks are in order, she said. Any changes will be communicated to districts in July or August before the statewide field test begins.
“If something changes it will be as a result of the feedback and in turn it will represent deleting a process or form rather than adding to the system,” she said in an email interview.
A report on the field testing of the PGES for principals also will be ready in mid to late summer, Smith said.
Some changes already were made in the system after the early piloting in 2011-12, responding to the need to give principals more time to become certified evaluators and receive more training. KDE is still researching the amount of time it takes principals to conduct teacher observations, Smith said, but added, “Ultimately, district and school leaders will need to take a hard look at how principals and other school personnel are using their time.”
Research also is continuing into how the evaluation elements will be weighted for teacher performance level ratings, which are categorized as ineffective, developing, accomplished and exemplary. “A part of the field testing has been to gather the necessary data and feedback from field test participants to inform the recommendations for the Commissioner’s Teacher Effectiveness Steering Committee to consider to present to the KBE,” Smith said.
Information from the Measures of Effective Teaching research project, a national study funded by the Bill & Melinda Gates Foundation, also will be used in proposing percentages for measuring teacher effectiveness, along with other national and state research. The approach, said Smith, is “deliberative,” but she said she suspects regulations for the rating system will be discussed and developed before the end of 2013.