10.4 Classification Accuracy: Monte Carlo Approach Using R
To measure classification accuracy, two different approaches are commonly used (Paulsen et al., 2020):
- Attribute-level classification accuracy (ACA)
- Profile classification accuracy (PCA)
ACA and PCA can be calculated as:
\[\mathrm{ACA}=\frac{1}{N \times K} \sum_{i=1}^{N} \sum_{k=1}^{K} I\left[\hat{\alpha}_{ik}=\alpha_{ik}\right]\]
and
\[\mathrm{PCA}=\frac{1}{N} \sum_{i=1}^{N} I\left[\hat{\alpha}_{i}=\alpha_{i}\right]\]
where:
- \(\hat{\alpha}_{ik}\) is the estimated mastery status for examinee \(i\) on the \(k\)-th attribute.
- \(\alpha_{ik}\) is the true mastery status for examinee \(i\) on the \(k\)-th attribute.
- \(\hat{\alpha}_{i}\) is the estimated mastery profile for examinee \(i\).
- \(\alpha_{i}\) is the true mastery profile for examinee \(i\).
- \(I[\cdot]\) is the indicator function, equal to 1 when its argument is true and 0 otherwise.
Note:
- Both ACA and PCA range from 0 to 1; each indicates the proportion of estimated attributes or profiles that match the true classifications.
Let us explore an example:
Johnson and Sinharay (2018) fit a loglinear CDM to a set of data from the Examination for the Certificate of Proficiency in English (ECPE) Grammar Test, which includes the responses of 2,922 examinees to 28 items measuring knowledge of (1) morphosyntactic rules, (2) cohesive rules, and (3) lexical rules. The data set and Q-matrix can be obtained by the following code:
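A minimal sketch of one way to load them, assuming the ECPE data bundled with the `CDM` package (`data.ecpe`) and the `GDINA` package for estimation:

```r
library(GDINA)

# ECPE responses and Q-matrix as bundled in the CDM package (an assumption;
# any copy of the ECPE data and Q-matrix would serve equally well)
data(data.ecpe, package = "CDM")
dat <- data.ecpe$data[, -1]   # drop the examinee id column
Q   <- data.ecpe$q.matrix     # 28 items x 3 attributes
```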
To perform the Monte Carlo method, we first fit the logit-link G-DINA model to the data (Step 1).
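A sketch under the assumption that the `GDINA()` function's `linkfunc = "logit"` option gives the logit-link (LCDM-equivalent) saturated model:

```r
# Step 1: fit the saturated G-DINA model with a logit link
fit <- GDINA(dat = dat, Q = Q, model = "GDINA", linkfunc = "logit")
```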
In Step 2, the estimated item parameters and latent class proportion parameters can be obtained as shown below.
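A sketch assuming the `coef()` methods of the GDINA package, where `what = "lambda"` returns the estimated latent class proportions:

```r
# Step 2: extract item parameters and latent class proportions
item.parm <- coef(fit)                      # success probabilities per latent group
prop.parm <- c(coef(fit, what = "lambda"))  # proportions of the 2^K latent classes
```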
In Step 3, we simulate data:
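A sketch in which the true attribute profiles are drawn from the estimated class proportions and responses are then generated from the estimated item parameters via `simGDINA()` (the seed is arbitrary):

```r
# Step 3: draw true attribute profiles from the estimated class proportions,
# then simulate item responses from the estimated item parameters
set.seed(2023)                          # arbitrary seed for reproducibility
N     <- nrow(dat)
patts <- attributepattern(ncol(Q))      # all 2^K attribute patterns
cls   <- sample(seq_len(nrow(patts)), N, replace = TRUE, prob = prop.parm)
alpha.true <- patts[cls, ]              # true profiles of the simulated examinees

sim    <- simGDINA(N, Q, catprob.parm = item.parm, attribute = alpha.true)
simdat <- extract(sim, what = "dat")    # simulated response matrix
```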
In Step 4, we estimate the attribute profiles for the simulated data while holding the item parameters fixed at their Step 2 estimates.
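A sketch assuming that supplying `catprob.parm` as starting values together with `control = list(maxitr = 0)` keeps the item parameters at their supplied values, and that `att.dist = "fixed"` does the same for the class proportions:

```r
# Step 4: score the simulated data with item parameters and attribute
# distribution fixed at the Step 2 estimates (maxitr = 0 skips the EM updates)
fit.sim <- GDINA(dat = simdat, Q = Q, model = "GDINA", linkfunc = "logit",
                 catprob.parm = item.parm,
                 att.dist = "fixed", att.prior = prop.parm,
                 control = list(maxitr = 0))

# MAP estimates of the attribute profiles; only the first K columns are the
# attributes (personparm() appends a flag for multimodal posteriors)
alpha.map <- as.matrix(personparm(fit.sim, what = "MAP")[, seq_len(ncol(Q))])
```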
Step 5 compares the estimated attribute profiles with the simulated ones.
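A sketch of the attribute-level comparison; the column means of the agreement matrix give the ACA for each attribute:

```r
# Step 5: attribute-level agreement (ACA) between MAP estimates and true profiles
round(colMeans(alpha.map == alpha.true), 4)
# profile-level accuracy (PCA) can be computed analogously:
# mean(rowSums(alpha.map == alpha.true) == ncol(Q))
```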
## A1 A2 A3
## 0.9127 0.8680 0.9338
The accuracy of the MAP estimates at the attribute-pattern (latent class) level can be assessed using the code below.
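A sketch that averages the exact-match indicator within each true latent class, reusing the class indices `cls` drawn in Step 3; each row of the output corresponds to one of the \(2^3 = 8\) attribute patterns:

```r
# exact-match indicator per simulated examinee
match.i <- as.numeric(rowSums(alpha.map == alpha.true) == ncol(Q))

# average the indicator within each true latent class (cls from Step 3)
aggregate(match.i, by = list(cls), FUN = mean)
```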
## Group.1 x
## 1 1 0.91431
## 2 2 0.05366
## 3 3 0.00000
## 4 4 0.46246
## 5 5 0.44890
## 6 6 0.19371
## 7 7 0.63537
## 8 8 0.91974