From the Editor

Assessment Update

Education Week reports continuing erosion in the number of states planning to administer the assessments created by the Smarter Balanced Assessment Consortium (SBAC) and the Partnership for Assessment of Readiness for College and Careers (PARCC). The SBAC and PARCC assessments were specifically designed to measure student progress toward mastering the Common Core State Standards. While the standards themselves (renamed, revised, and in some places still reviled) have largely survived, the assessments have not fared as well. In some states, pulling out of the assessment consortia was the price of seeing the Common Core survive; in most, the assessments also suffered in the general backlash against "too much testing."

Even so, the assessment picture seems to have stabilized. Ed Week reports that for the 2016-17 school year, 20 states and the District of Columbia will administer the PARCC (7 states) or SBAC (14 states) assessments, the same number as last year. Three states will give tests that blend locally developed questions with items from PARCC or SBAC. The remaining 27 states will administer tests they developed themselves or purchased from vendors.

The drop-off in consortia-inspired testing has been sharpest at the high school level. Only 15 of the 21 states using tests from the consortia will use those tests in high school. There has been a corresponding rise in the number of states that will use the ACT or SAT as their high school test. As part of their efforts to increase awareness of and equitable access to higher education, states have increasingly required all students to take the ACT or SAT. Half the states (25) now have such a requirement in place. Twelve of those states also use the SAT or ACT for federal accountability purposes. Ed Week notes that in most of those states, the college-admissions tests have become the only way high school achievement is measured and reported to the federal government. With ESSA now clearly stating that states can use a "nationally recognized high school academic assessment" in place of state-designed academic assessments, it seems likely that more states will get on that bandwagon: it's less expensive, takes less time, and cuts down on the number of tests high schoolers have to contend with.

The Center on Education Policy (CEP) at the George Washington University has just released results of a survey, conducted in early 2016, that asked superintendents about their districts' experiences preparing for and administering CCSS-aligned assessments in the spring of 2015. There's a good bit of interesting data in this report, but the state assessment picture has changed so dramatically over the past two years that it's hard to know how to use the information to anticipate future experiences with assessment. It is striking, however, that even for this first administration of the Common Core assessments (which was not exactly a smooth rollout), about half of district leaders agreed that the new assessments did a better job than their previous state tests of measuring higher-order analytical and performance skills. About 40% said the new tests were driving instruction in positive ways. One wonders whether, had the decision about which test to use been left in the hands of district leaders, the move to new tests would have come as quickly.

One data point that probably has not changed much: district leaders agree that students spend too much time taking all types of school tests. Specifically, 72% agreed that students were tested too much at the elementary level, 66% at the middle school level, and 63% at the high school level. Nevertheless, schools did spend time preparing students for the tests. More than half (58%) of district leaders said that the average student in their district spent one week or less on activities designed to prepare for the spring 2015 Common Core assessments. Fourteen percent reported that students spent more than one week but less than two on test prep, 9% said students spent more than two weeks but less than a month, and 10% said students spent a month or more.

If the Center on Education Policy continues to focus its research efforts on standards and assessment, it will be interesting to see how much things change when districts report on their 2016 or 2017 assessment experiences. I would hope that ESSA's reduced emphasis on assessment as a way to evaluate schools and teachers might lead to less anxiety in classrooms about test prep. But my experience says that we still need a serious discussion about assessment: what it can and can't do, and how we might best use it to truly guide instruction. I don't think we've come anywhere close to resolving the really big questions around student achievement and the role of assessment.