7.3 Relative Model-Data Fit at Test Level (Cont’d)
Let P be the number of model parameters and N the number of cases (examinees). Several information criteria can then be defined from the maximized log-likelihood, logL(Y):
The Akaike (1974) information criterion (AIC) adjusts the -2 log-likelihood by twice the number of parameters in the model:
AIC = −2logL(Y) + 2P
The Schwarz (1978) Bayesian criterion, also known as the Bayesian information criterion (BIC), has a stronger penalty than the AIC for overparameterized models, and adjusts the -2 log-likelihood by the number of parameters times the log of the number of cases:
BIC = −2logL(Y) + Plog(N)
Bozdogan's (1987) consistent Akaike information criterion (CAIC) also has a stronger penalty than the AIC for overparameterized models, and adjusts the -2 log-likelihood by the number of parameters times one plus the log of the number of cases. As the sample size increases, the CAIC converges to the BIC:
CAIC = −2logL(Y) + P[log(N) + 1]
The sample-size-adjusted BIC (SABIC) was proposed by Sclove (1987) to reduce the penalty in the BIC:
SABIC = −2logL(Y) + Plog[(N + 2)/24]
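To make the formulas concrete, all four criteria can be computed directly from the maximized log-likelihood, the parameter count P, and the sample size N. A minimal Python sketch; the numeric values in the usage example are made up for illustration:

```python
import math

def aic(logL, P):
    # AIC = -2 logL + 2P
    return -2 * logL + 2 * P

def bic(logL, P, N):
    # BIC = -2 logL + P log(N)
    return -2 * logL + P * math.log(N)

def caic(logL, P, N):
    # CAIC = -2 logL + P [log(N) + 1]
    return -2 * logL + P * (math.log(N) + 1)

def sabic(logL, P, N):
    # SABIC = -2 logL + P log((N + 2) / 24)
    return -2 * logL + P * math.log((N + 2) / 24)

# Hypothetical fit: logL = -1000, P = 10 parameters, N = 500 examinees
print(aic(-1000, 10))         # 2020.0
print(bic(-1000, 10, 500))
print(caic(-1000, 10, 500))
print(sabic(-1000, 10, 500))
```

For any fixed model, smaller values indicate better relative fit; note that with N = 500 the SABIC penalty per parameter (log 502/24 ≈ 3.04) is much milder than the BIC penalty (log 500 ≈ 6.21).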
7.3.1 Calculating the number of model parameters for CDMs
The number of model parameters (P) is calculated differently for different models. For instance:
For the DINA model, P = 2J + 2^K − 1, where J is the number of items and K is the number of attributes (the 2^K − 1 term comes from the joint attribute distribution).
For the ACDM, P = Σ_{j=1}^{J} K*_j + J + 2^K − 1, where K*_j is the number of attributes required by item j (the length of its reduced attribute vector).
For saturated models (e.g., the G-DINA model), P = Σ_{j=1}^{J} 2^{K*_j} + 2^K − 1.
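These counts can be scripted directly from a Q-matrix summary. A small Python sketch, assuming a hypothetical test with J = 10 items, K = 3 attributes, and the per-item required-attribute counts K*_j listed below:

```python
def p_dina(J, K):
    # DINA: 2 item parameters (guess, slip) per item, plus 2^K - 1
    # parameters for the joint attribute distribution
    return 2 * J + 2 ** K - 1

def p_acdm(kstar, K):
    # ACDM: intercept + K*_j main effects per item, plus 2^K - 1
    J = len(kstar)
    return sum(kstar) + J + 2 ** K - 1

def p_gdina(kstar, K):
    # Saturated G-DINA: 2^{K*_j} item parameters per item, plus 2^K - 1
    return sum(2 ** k for k in kstar) + 2 ** K - 1

# Hypothetical K*_j values: number of attributes each item requires
kstar = [1, 2, 1, 3, 2, 1, 2, 1, 3, 2]
J, K = len(kstar), 3
print(p_dina(J, K))      # 27
print(p_acdm(kstar, K))  # 35
print(p_gdina(kstar, K)) # 47
```

As expected, the saturated G-DINA model uses the most parameters and the DINA model the fewest, which is what the information criteria above trade off against fit.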
We can also obtain the number of model parameters from the GDINA R package, as demonstrated in the Model-Identifiability class.