Published: March 28, 2024

Over the past four years, CADRE has been involved in a Research-Practice Partnership with assessment specialists from the organization Curriculum Associates on a project exploring innovations in the presentation and interpretation of student growth from large-scale educational assessments. CADRE's central innovation in this space is what we have come to describe as a "content-referenced growth" (CRG) reporting approach, in contradistinction to the two more predominant approaches to thinking about student growth: norm-referenced and criterion-referenced growth. The idea, in a nutshell, is to shift attention from growth interpretations that emphasize "points earned" to growth interpretations that focus on conceptual changes in student thinking, changes that can be characterized in concrete terms by the content of the assessment in the form of exemplar items.

The embodiment of this idea is found in a prototype we have developed for an interactive digital score reporting interface. CRG reporting is intended to help teachers understand the changes in thinking that occur as students advance in their learning of different skills and standards across content areas. The goal is for teachers to identify common ways that students think about problems, and to connect changes in this thinking to the growth a student has made in assessment scores. 

The reporting interface is designed to accomplish this by giving users the opportunity to visualize and interpret student growth across multiple assessment occasions, using a score scale whose interpretation is facilitated by qualitatively distinct reference locations. Importantly, we also want the reporting interface to be informed by cognitive principles for the effective representation of quantitative data, and by the intended answers—as well as new questions—that the interface should provide and elicit about student growth.

To get the gist of what we are trying to accomplish, we invite the reader to interact with version 1.0 of CADRE's digital reporting prototype, which can be accessed online.

Two of the major inspirations for the CRG approach come from item mapping and from the research literature on learning progressions (e.g., Clements & Sarama, 2004; Alonzo & Gotwals, 2012). Other related explications of the ideas behind CRG can be found in Briggs & Peck (2015), Briggs (2019), and Wilson (2023).
Figure 1. Theory of Action Behind CRG Reporting

A high-level theory of action for CRG reporting is outlined in Figure 1. The top of the diagram starts with the end goal, which is to facilitate improved learning outcomes for students. This is accomplished through teachers adjusting their instruction based upon the inferences about growth and learning they can make from using the CRG reporting interface. There are two secondary goals of the prototype: 1) to contribute to teachers' professional learning, and 2) to improve attitudes about the usefulness of assessment. We believe that as teachers attempt to connect student growth to changes along an underlying learning progression, they will also be engaging in a process of professional learning that deepens their content area expertise. Similarly, as teachers learn, with support, to interact with the reporting interface, we hope they develop an improved relationship with assessment and data use. In the Figure 1 diagram, the foundation for the CRG reporting interface is depicted by four building blocks: a domain-specific learning progression, targeted test design, a scale established using item response theory (i.e., the Rasch model), and items mapped from the scale to levels of the learning progression.
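As a rough illustration of how the last two building blocks fit together, the sketch below shows the Rasch model and one common item-mapping convention. Everything here is a simplified, hypothetical sketch: the function names and the use of a 0.67 response-probability (RP67) criterion are our illustrative assumptions, not necessarily the choices made for the actual CRG scales.

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct response for a student at scale location
    `theta` on an item with difficulty `b`, under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def mapped_location(b, rp=0.67):
    """Scale location at which a student would have probability `rp` of
    answering an item of difficulty `b` correctly. Item mapping places
    the item at this point on the score scale."""
    return b + math.log(rp / (1.0 - rp))

# A student located exactly at an item's difficulty has a 50% chance of success.
print(round(rasch_prob(0.0, 0.0), 2))   # 0.5

# With an RP67 convention, an item of difficulty 1.0 maps about 0.71 logits
# above its difficulty, the point where the success probability reaches 0.67.
loc = mapped_location(1.0)
print(round(rasch_prob(loc, 1.0), 2))   # 0.67
```

Mapping items to scale locations in this way is what lets exemplar items anchor qualitatively distinct reference points on the score scale: when a student's score moves past an item's mapped location, that change can be described in terms of the thinking the item demands, not just points earned.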

To date we have developed two different prototypes for a CRG reporting interface specific to learning progressions for the core concepts of fractions and phonics knowledge within the content domains of mathematics and reading. Reports on learning progression development and validation in these two areas can be found in Wellberg, Briggs & Student (2023) and Cox & Briggs (2023). Additional prototypes are in development in mathematics for algebraic functions and geometric measurement. 

We have engaged in two rounds of interviews with teachers, in 2022 and 2023, as they have interacted with the CRG prototypes for fractions and phonics. The results from these interviews have helped us to refine our theory of action (e.g., the details of what we expect in the arrows linking the reporting interface, educator interaction, and inferences about learning and growth shown in Figure 1) and have motivated ongoing revisions that will lead to version 2.0 of the CRG reporting prototype (currently in progress). For reports on these interviews and lessons learned, see Briggs et al. (2023) and Buchbinder et al. (2023).

A full manuscript describing the CRG approach is currently in progress, with plans to submit it for peer review by July 2024. In the meantime, we will be presenting this work as part of a symposium at the annual meeting of the National Council on Measurement in Education in Philadelphia on Friday, April 12 ("Challenges and Innovations in Creating Interactive Reports of Student Progress and Growth") and also at the annual National Conference on Student Assessment in Seattle in June.

The CADRE PI for the CRG research project is Derek Briggs. Major contributions have come from CADRE graduate student researchers (in alphabetical order) Nicolas Buchbinder, Olivia Cox, Kyla McClure, Sarah Wellberg and Erik Whitfield.

References

Alonzo, A., & Gotwals, A. (2012). Learning progressions in science: Current challenges and future directions. SensePublishers.

Briggs, D. C. (2019). Interpreting and visualizing the unit of measurement in the Rasch model. Measurement, 46, 961–971.

Briggs, D. C., & Peck, F. A. (2015). Using learning progressions to design vertical scales that support coherent inferences about student growth. Measurement: Interdisciplinary Research & Perspectives, 13, 75–99.

Clements, D. H., & Sarama, J. (2004). Learning trajectories in mathematics education. Mathematical Thinking and Learning, 6(2), 81–89.

Wilson, M. (2023). Constructing measures: An item response modeling approach. Routledge.