
In New Hampshire, Upper Valley Schools Struggling With State Testing



By EmmaJean Holley
Valley News Staff Writer
Sunday, November 05, 2017

New Hampshire students who took the Smarter Balanced Assessment Consortium standardized tests in the 2016-17 academic year demonstrated a decline in proficiency in virtually every grade and both subjects — results that have state education officials analyzing the drop in scores to better understand its significance and local officials questioning the usefulness of the test itself.

The Smarter Balanced test is meant to measure reading and math proficiency for third- through eighth-graders, with 11th-graders taking the SAT. The results announced by the New Hampshire Department of Education late last month showed that scores, with the exception of slight increases in seventh-grade reading and 11th-grade math, dipped across the board.

“We are obviously concerned about the decline in student performance and will be working closely with schools to understand the underlying drivers,” state Education Commissioner Frank Edelblut said in an Oct. 23 news release. He also noted that reading proficiency declined in every state that administers the Smarter Balanced test, with the exception of California. Reading and math scores also declined slightly in Vermont, where students in grades three through eight and grade 11 take the Smarter Balanced test.


Of course, averages don’t tell the whole story. The results from individual schools differ widely throughout the Upper Valley, raising the question of what educators will, and should, do with this data.

At Claremont Middle School, the percentage of students who scored proficient or above declined at rates similar to the statewide trend — usually by 1 to 4 percentage points — though unlike the state as a whole, Claremont Middle School scores declined across the board.

Assistant Superintendent Cory LeClair did not express concern over these drops. Instead, she called into question the validity of assessments like the Smarter Balanced test, in part because it measures proficiency, rather than improvement over time.

She also suggested that students and educators may still be adjusting to the Smarter Balanced test, which is only in its third year. The Smarter Balanced tests are administered in part to allow states to fulfill testing requirements of the Every Student Succeeds Act, the federal education law that replaced the No Child Left Behind Act.

Starting next year, New Hampshire students will be taking a new standardized test, called the New Hampshire Statewide Assessment System, or NH SAS, the Department of Education announced in a Sept. 13 news release, because the state is transitioning to a new assessment vendor. The NH SAS will replace both the Smarter Balanced test and the New England Common Assessment Program, or NECAP, which tests students in grades five, eight and 11 on science.

Heather Gage, director of the Division of Educational Improvement at the New Hampshire Department of Education, said one of the advantages of the NH SAS is that it can be customized by the state, unlike the Smarter Balanced test, which is part of a consortium. And since the testing platform and interface will remain the same, students and educators won’t have to adjust to a new software program all over again, she added.

“(I)t’ll be very similar to the former assessment, and therefore very familiar to the school district,” Gage said in a phone interview. “It’ll still be rigorous, it’ll test for the same material and it’ll still be adaptive,” meaning that the test gets harder if a student answers correctly, and easier if the student’s answer is wrong.

One difference is that reading and math test times will be reduced, which allows the Department of Education to add a writing component that will be “AI (Artificial Intelligence) scored,” according to the news release.

Because of what LeClair perceives as the changing landscape of state assessment models in recent years, “proficiency” has become something of a moving target, she said. “It’s really challenging to take stock in something that continues to shift. And so, as the target continues to shift, we have to rely on more internal growth measures.”

Among these measures in Claremont is the computerized testing program i-Ready. Students take three i-Ready assessments per year, with the idea being that more frequent check-ins help to “determine our students’ progress on an individual basis, and adjust our instruction accordingly,” LeClair said. “We don’t put as much thought into the state assessment when that assessment continues to change, and continues to reflect only end results without looking at the starting point.”

She said she doesn’t know what the school district will use its Smarter Balanced data for.

As tends to be the case, achievement gaps between higher- and lower-income communities were significant: At Hanover’s Ray School, for example, 92 percent of fourth-graders scored proficient or above in reading, and 91 percent in math. Scores for low-income students — defined by whether they qualify for free or reduced-price lunch — are not available because fewer than 11 of the class’s 85 students come from this demographic, and the state does not report the results of subgroups that small due to privacy concerns.

Meanwhile, in the Mascoma Valley Regional School District, which also has 85 fourth-graders, distributed between its Canaan and Enfield elementary schools, just over half — 55 percent — scored proficient in reading. Thirty of those 85 students qualify for free or reduced-price lunch.

Scores were even lower — and the percentage of students in poverty even higher — at Richards Elementary School in Newport. Only 19 percent of fourth-graders passed in reading, and 11 percent in math, some of the lowest Smarter Balanced scores on the New Hampshire side of the Upper Valley.


Richards Elementary Co-Principal Phil Banios was disappointed to see these scores, but said that “one thing these scores don’t mean is that our kids and our teachers aren’t as good as other places.” He emphasized that while he does not take the results lightly, it’s also important to keep them in perspective. “Given the demographic we’re dealing with,” he said, “this isn’t Hanover or Grantham.” Nearly half of the fourth-grade class at Richards Elementary — 27 out of 58 students — live in poverty.

Because the student body at Richards Elementary is “economically diverse,” Banios said, students’ needs are diverse, too. If these needs aren’t being met in or outside the classroom because of a lack of resources, “many, unfortunately, don’t put forth the same level of effort (on standardized tests) as students in other places” where academic success is more of a given.

Low-income students also are more likely to have learning disabilities, or at the very least to be what Banios called “non-traditional learners,” whose aptitude in hands-on learning situations may not translate to tests like Smarter Balanced, which “narrows the ability for students to show what they can do,” he said.

He also pointed out that test scores can affect the economic health of a community: Families thinking about moving to the area may take schools’ standardized test performance into consideration, and school districts with high test scores tend to have higher property values. Achievement gaps between high-performing districts and low-performing districts can therefore reinforce divisions between wealthy and poorer communities, making it less likely that low-income kids will have access to the same educational opportunities as their more affluent counterparts.

“We’re not making excuses, and we’re not happy with these scores as we see them,” Banios said. “We’re going to monitor closely, and expect improvement. But I do not want people to read too much into these scores, because there are many factors that go into them that can’t be measured on the test.”

Soon, though, Richards Elementary will have dramatically fewer of what he called “skill and drill” tests to worry about. The school is moving to a new accountability system in New Hampshire called Performance Assessment of Competency Education, or PACE, which ideally gives students flexibility in demonstrating what they know and can do. This so-called “competency-based education” is meant to ensure that students fully understand a topic before moving on, so that no child gets left behind, so to speak.

Whereas tests like Smarter Balanced are developed by testing companies to meet federal accountability standards, PACE assessments are developed in-district, which was a major draw for Banios. He thinks becoming a PACE district will give students more opportunities to succeed. Although the program was rolled out only gradually beginning in 2014, early analysis shows some promise that competency-based assessments may better reflect students’ knowledge: A researcher at the University of New Hampshire found that in the second year of PACE’s implementation, students outperformed their non-PACE peers in eighth-grade math, though other results haven’t been released yet, the Concord Monitor reported.

The Newport district is considered a “Tier 1” district, meaning it’s ready to begin using its locally designed PACE assessments, whereas “Tier 2” districts are still developing their assessments. The PACE test is not a substitute for the Smarter Balanced test or the NH SAS: Students in PACE districts must still take those tests once in elementary school and once in middle school, and take the SAT in high school. In all other years, PACE districts administer their own assessments, which still must meet state testing standards.

Smarter Balanced scores from schools like Richards Elementary highlight the ongoing problem of economic disparity in the Granite State. But some schools had more erratic results, raising the question of how to interpret steep rises and sharp declines.

In the 2015-16 testing year at the North Charlestown Community School, for example, 75 percent of fifth-graders scored proficient or above in reading. This year, only 38 percent did — a drop of 37 percentage points.

But last year, when this year’s fifth-graders were fourth-graders, only 24 percent of them passed in reading. That would make this year’s 38 percent proficiency rate an improvement, not a red flag. How does one read these contradictory trends?

“Yeah, that’s a tough one,” Fall Mountain School District Superintendent Lori Landry said in a phone interview. She plans on examining the data further, but said the district is moving toward smaller-scale, competency-based assessment methods. Like the Claremont School District, Fall Mountain tests students three times a year to keep a close eye on individual student performance, and like the Newport School District, it’s transitioning to a PACE-based assessment system that emphasizes authentic learning, rather than, as Banios put it, “skill and drill.” Fall Mountain is still considered a Tier 2 district.

While state results typically are reported as year-to-year comparisons within the same grade, rather than by following the same cohort over time, “I would always encourage people to look at cohort groups,” Saundra MacDonald, the administrator of the Bureau of Instructional Support and Student Assessment at the New Hampshire Department of Education, said in an email exchange last week. “It is the only way to do an apples to apples comparison.”

Several officials at the state Department of Education said they weren’t sure how, exactly, they would put these comparisons to use. It’s still early — data from PACE schools are not in yet, but “once we have received and synthesized all the data, we will begin analysis and planning,” MacDonald said.

In the meantime, school administrators are left to interpret student performance however they will, she said in a phone interview. But some administrators, like LeClair, are interpreting the data less as a measure of student ability, and more as a reflection of the test’s limitations.

“If the data across the whole state had a dip, and in other states too, I’m just wondering,” LeClair said. “If everybody declined, is that more indicative of everybody who took the assessment, everybody who taught for the assessment — or the assessment itself?”

And so measuring the test’s proficiency, much like measuring student proficiency, may not be as straightforward as it appears.


EmmaJean Holley can be reached at ejholley@vnews.com or 603-727-3216.