Traditionally, states have evaluated student performance using end-of-year summative assessments, which produce cut scores that classify examinees into broad performance categories such as “Nearing Proficiency” or “Proficient.” Such classifications indicate what students know and can do, but they do not provide specific, targeted statements about examinee abilities. In addition, because summative assessments are administered at the end of the year, when students are about to move on to another grade, they provide little opportunity for intervention during the school year.
Enter Cognitive Diagnostic Models (CDMs), a class of psychometric methods that provide a different way to describe how an examinee performed on a test. Rather than reporting a scaled score, CDMs produce a score profile that describes which specific skills and concepts on the test an examinee appears to have mastered or not mastered. The idea is to give educators and parents diagnostic information that can guide instruction for immediate intervention, rather than a broad statement about performance like a scaled score.
CDMs can be particularly useful for formative assessments, such as interim and through-year assessments. While not new, interest in CDMs is growing in school districts and the measurement community. Additionally, states have begun piloting more frequent interim assessments through the U.S. Department of Education’s Innovative Assessment Demonstration Authority (IADA) program, applying with plans to use innovative interim assessment methods, such as CDMs, that are tailored to individual student needs.
Examinee Skills and Concepts Profiles
Rather than listing an overall total score, as is typical of end-of-year score reports, the main result of a CDM is a student “skill profile.” The profile lists specific skills, such as subtracting two-digit numbers or decoding multisyllabic words, and assigns a number to each skill being assessed. For example, a “1” indicates that there is evidence the examinee has demonstrated mastery of the skill or concept, and a “0” indicates that there is no evidence of mastery. Some score profiles replace the 1s and 0s with the probabilities of mastery estimated by the CDM.
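As a rough illustration, the sketch below shows how estimated mastery probabilities might be turned into the 1/0 flags described above. The skill names, probability values, and the 0.5 decision rule are illustrative assumptions, not output from any particular CDM software.

```python
# Hypothetical example: converting CDM mastery probabilities into a 1/0 skill profile.
# Skill names, probabilities, and the 0.5 cutoff are illustrative assumptions only.

mastery_probabilities = {
    "Subtract two-digit numbers": 0.91,
    "Decode multisyllabic words": 0.38,
    "Identify the main idea": 0.74,
}

CUTOFF = 0.5  # assumed decision rule: probability >= 0.5 counts as evidence of mastery

skill_profile = {
    skill: int(prob >= CUTOFF) for skill, prob in mastery_probabilities.items()
}

for skill, flag in skill_profile.items():
    print(f"{skill}: {flag}")
# Subtract two-digit numbers: 1
# Decode multisyllabic words: 0
# Identify the main idea: 1
```

A report built from a profile like this could show either the 1/0 flags or the underlying probabilities, depending on how much detail educators and parents want.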
CDM profiles indicate the presence or absence of specified skills and concepts. They are intended to guide instructional planning for each individual student. Given their relatively immediate impact on individual students, ensuring the quality of such assessments is of great importance, as explained in the HumRRO blog, Ensuring the Quality of Nontraditional Educational Assessment Systems.
CDMs for Formative Assessment: Interim and Through-Year Assessments
We believe the application of CDM analysis is most fruitful at the local level, where formative assessments are tailored to instruction and can provide actionable feedback to teachers about which of their students have probably mastered which content standards (that is, learning outcomes) and what their current learning needs are. The most prevalent commercial offerings are interim assessments; the design generating the most discussion right now is the through-year assessment (TYA).
The prevailing TYA design concept is that the fall administration, for example, assesses only those content standards covered during the fall instructional period, the winter administration assesses only those standards covered during that instructional period, and so forth. TYAs are often referred to as “curriculum-aligned” or “instructionally aligned” because they follow school district curriculum pacing guides.
CDM analysis and reporting may be most appropriate for TYAs, where the Q-matrix (the matrix that specifies which skills and concepts each test item measures) can focus on skill and concept profiles that are consistent with the content standards covered during a specific period of instruction. Vendors must offer considerable flexibility in creating Q-matrices that align with differences among school district pacing guides.
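To make the Q-matrix idea concrete, here is a minimal sketch of what a Q-matrix for a small fall TYA form might look like. The items, skills, and pacing-guide coverage are entirely hypothetical and only meant to show the structure: rows are items, columns are skills, and a 1 indicates that the item requires that skill.

```python
# Minimal Q-matrix sketch for a hypothetical 5-item fall TYA form.
# Rows are items, columns are skills (attributes) from an assumed fall pacing guide;
# a 1 means the item requires that skill. All labels are illustrative.
import numpy as np

skills = [
    "Subtract two-digit numbers",
    "Multiply one-digit numbers",
    "Interpret bar graphs",
]

Q = np.array([
    [1, 0, 0],  # Item 1: subtraction only
    [1, 0, 0],  # Item 2: subtraction only
    [0, 1, 0],  # Item 3: multiplication only
    [0, 1, 1],  # Item 4: multiplication and graph interpretation
    [0, 0, 1],  # Item 5: graph interpretation only
])

# A quick check a vendor or district might run: every skill in the
# fall window should be measured by at least one item on the form.
for skill, n_items in zip(skills, Q.sum(axis=0)):
    print(f"{skill}: measured by {n_items} item(s)")
```

A winter form would use a different set of columns, reflecting the standards covered during that instructional period, which is why flexibility in building Q-matrices matters so much across districts with different pacing guides.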
For a more detailed analysis, please read Cognitive Diagnostic Models (CDMs): A Gentle Introduction and Exploration of Their Potential Use for Formative Assessment, by HumRRO Vice President Harold Doran, Ed.D.; Principal Scientist Steve Ferrara, Ph.D.; Senior Scientist Hye-Jeong Choi, Ph.D.; and Research Scientist Nnamdi Ezike, Ph.D.
In the paper, we describe CDMs and offer considerations relevant to their practical, operational use in state and district assessment programs. We start from the proposition that the term “diagnostic” refers to providing actionable feedback on the skills and concepts that students likely have mastered and on their specific learning needs. To us, this is about formative assessment: teachers can use the feedback from CDMs to design instruction around learners’ needs. HumRRO’s expert psychometric services can assist clients in establishing and improving the quality of their assessment programs.