How does standards-based grading affect standardized test scores?

A teacher from a school in Iowa recently contacted me with an inquiry,

Is there any research indicating students going through a standards-based grading system show improved standardized test scores or college readiness compared to students who go through a traditional grading system?

Context: What we know about educational research

This is an excellent question that comes up often when educators and their school systems are considering a shift towards standards-based grading. Based upon my knowledge of research methodology, it would be challenging (but not impossible) to conduct a study truly isolating standards-based grading as the single variable influencing a school's student achievement data. In other words, we know that any educational study has limitations (e.g., fidelity of implementation, additional initiatives simultaneously assisting or hindering test scores, the alignment between state standards taught and standards assessed, and any placebo effect). Finally, any experienced educational researcher will readily admit the challenge of generalizing study results outside of the original context.

How does standards-based grading affect standardized test scores?

I scoured the literature as part of my recent dissertation and found only a few quantitative studies that consider this type of question (How does SBG affect standardized test scores?). Some look at an entire school, while others look at a smaller slice of a school, such as an individual classroom. Each clearly states its limitations. Here's a quick copy/paste from my dissertation:

The impact of standards-based grading on external achievement measures is mixed. While some studies indicate a significant statistical difference in one or more content areas (e.g. Haptonstall, 2010; Pollio & Hochbein, 2015), others have not (e.g. Rosales, 2013; Welsh, D’Agostino, & Kaniskan, 2013).

For additional clarity, I've included more detail below about each of the aforementioned citations.

Haptonstall (dissertation): From the abstract of the study, emphasis mine:

This study examined the correlation between the grades a student earns in his or her classroom and the scores that each student earned on the Colorado Student Assessment Program tests, in Reading, Writing, Math, and Science. The study also examined the mean scores of varying sub-groups to determine if certain sub-groups demonstrated higher means, dependent on the school districts in which they were enrolled. While all the school districts that participated in the study showed a significant level of correlation between grades and test scores, Roaring Fork School District Re-1, using a standards-based grading model, demonstrated both higher correlations and higher mean scores and grades across the overall population and sub-groups.

Pollio & Hochbein: From the abstract of the study:

Results indicated that the rate of students earning an A or B in a course and passing the state test approximately doubled when utilizing standards-based grading practices. In addition, results indicated that standards-based grading practices identified more predictive and valid assessment of at-risk students’ attainment of subject knowledge.

Rosales: From the dissertation study, emphasis mine:

This study seeks to determine whether standards-based grading has the same effect on students at the high school level (grades 9-12) by comparing end-of-course test scores and posttest scores of Algebra 2 students enrolled in a standards-based graded classroom to those enrolled in a traditionally-graded classroom. Two teachers each taught two classes of Algebra 2 and graded one class using standards-based grading and one class using traditional grading methods. Students at both the honors level and the regular level of mathematics were included in the study.
Honors students performed better than regular students on both assessments, but no significant difference was found between the performance of traditionally-graded students and the students who were graded with standards-based grading. The results of this study indicate that standards-based grading may offer improved methods of communication between teachers, parents, and students and may give students a new perception of learning.

Welsh, et al.: From the abstract, emphasis mine:

Standards-based progress reports (SBPRs) require teachers to grade students using the performance levels reported by state tests and are an increasingly popular report card format. They may help to increase teacher familiarity with state standards, encourage teachers to exclude nonacademic factors from grades, and/or improve communication with parents. The current study examines the SBPR grade–state test score correspondence observed across 2 years in 125 third and fifth grade classrooms located in one school district to examine the degree of consistency between grades and state test results. It also examines the grading practices of a subset of 37 teachers to determine whether there is an association between teacher appraisal style and convergence rates. A moderate degree of grade–test score convergence was observed using three agreement estimates (coefficient kappa, tau-b correlations, and classroom-level mean differences between grades and test scores). In addition, only small amounts of grade–test score convergence were observed between teachers; a much greater proportion of variance lay within classrooms and subjects.

Given inconclusive evidence, why would a teacher or school decide to embark upon standards-based grading?

Here's what I've found: First, the absence of research supporting traditional grading practices is concerning. Too many times, stakeholders enter these conversations assuming traditional grading practices are some type of research-proven paradigm when, in fact, schools inherited the traditional grading system over one hundred years ago (Cureton, 1971).

Next, beyond the limited quantitative studies related to standards-based grading available right now (and summarized above), classroom teachers can do a better job aligning what we teach with what we assess, a fundamental tenet of grading reform. In other words, given this dearth of evidence, educators should strongly consider pragmatic solutions. For more on these pragmatic themes and their connection to scholarly literature, I encourage readers to consider "What does the research say about standards-based grading?", a research primer I co-wrote with Dr. Tom Buckmiller several years ago.

Finally, standards-based grading provides students and parents with more useful information about current levels of work ("proficient understanding of the Pythagorean Theorem") than traditional grading practices do ("85% on the Chapter 3 Test").

In summary, the impact of standards-based grading on external achievement measures is mixed. With few quantitative studies upon which to base this conclusion, educators should consider the pragmatic benefits of standards-based grading practices and urge educational researchers to address this question in more detail in the future.

Works Cited

Cureton, L. W. (1971). The history of grading practices. NCME Measurement in Education, 2(4), 1-8.

Haptonstall, K. G. (2010). An analysis of the correlation between standards-based, non-standards-based grading systems and achievement as measured by the Colorado Student Assessment Program (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses. (UMI No. 3397087)

Pollio, M., & Hochbein, C. (2015). The association between standards-based grading and standardized test scores as an element of a high school model reform. Teachers College Record, 117(11), 1-28.

Rosales, R. B. (2013). The effects of standards-based grading on student performance in Algebra 2 (Doctoral dissertation). Retrieved from http://digitalcommons.wku.edu/diss/53/

Welsh, M. E., D’Agostino, J. V., & Kaniskan, B. (2013). Grading as a reform effort: Do standards-based grades converge with test scores? Educational Measurement: Issues and Practice, 32(2), 26-36. doi:10.1111/emip.12009
