4 Aspects

4.1 Definition

An aspect is a real-valued function \(\phi\) defined on the compact convex set \(\mathcal{R}^{m\times m}\) of correlation matrices of order \(m\). Note that a correlation matrix is a positive semi-definite matrix with ones on the diagonal.

In De Leeuw (2004) a class of MVAOS techniques is defined by optimizing aspects, using majorization algorithms. Optimization is over a set \(\mathcal{R}\) of correlation matrices, usually the correlation matrices that correspond with admissible transformations of the data. See De Leeuw (1988) and De Leeuw, Michailidis, and Wang (1999) for additional results on aspects. Software in R that optimizes general aspects is discussed by Mair and De Leeuw (2010).

The aspect optimization algorithm is based on majorization, and it assumes that the aspect being maximized is a convex function on the space of correlation matrices (or, equivalently, that the aspect is concave and is minimized). Let's give examples of some interesting convex aspects; a short computational sketch in R follows each of the two lists below.

  • The sum of the \(p\) largest eigenvalues of the correlation matrix (as in principal component analysis).
  • The squared multiple correlation (SMC) of one variable with the others (as in multiple regression).
  • The sum of the SMC’s over some or all variables (as in path analysis).
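To make these aspects concrete, here is a minimal sketch in base R that evaluates them for a given correlation matrix. The function names phi_eigen, phi_smc, and phi_smc_sum are illustrative, not part of any package; the SMC is computed from the standard identity \(\mathrm{SMC}_j=1-1/(R^{-1})_{jj}\).

```r
## Minimal sketch in base R; phi_eigen, phi_smc, and phi_smc_sum are
## illustrative names, not functions from any package.
set.seed(1)
R <- cor(matrix(rnorm(100 * 4), 100, 4))

## sum of the p largest eigenvalues (principal component analysis aspect)
phi_eigen <- function(R, p = 2) {
  lambda <- eigen(R, symmetric = TRUE, only.values = TRUE)$values
  sum(sort(lambda, decreasing = TRUE)[1:p])
}

## squared multiple correlation of variable j with the others
## (multiple regression aspect), using SMC_j = 1 - 1 / (R^{-1})_{jj}
phi_smc <- function(R, j) {
  1 - 1 / solve(R)[j, j]
}

## sum of the SMCs over all variables (path analysis aspect)
phi_smc_sum <- function(R) {
  sum(1 - 1 / diag(solve(R)))
}

phi_eigen(R, p = 2)
phi_smc(R, j = 1)
phi_smc_sum(R)
```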

There are also convex aspects that are not directly associated with a standard multivariate technique.

  • The sum of the \(p^{th}\) powers of the correlation coefficients, with \(p\) an even positive integer (or \(p=1\), in which case the aspect is linear and hence convex).
  • The sum of the \(p^{th}\) powers of the absolute values of the correlation coefficients, with \(p\geq 1\).
  • Any norm on the space of correlation matrices.
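These aspects are evaluated just as easily. The sketch below, again with illustrative function names (phi_power, phi_abs_power, phi_frobenius), uses the off-diagonal correlations and takes the Frobenius norm as one example of a matrix norm.

```r
## Minimal sketch in base R; phi_power, phi_abs_power, and phi_frobenius
## are illustrative names.
set.seed(1)
R <- cor(matrix(rnorm(100 * 4), 100, 4))  # as in the previous sketch

## sum of the p-th powers of the off-diagonal correlations
## (p an even positive integer, or p = 1)
phi_power <- function(R, p = 2) {
  sum(R[upper.tri(R)]^p)
}

## sum of the p-th powers of the absolute off-diagonal correlations, p >= 1
phi_abs_power <- function(R, p = 1) {
  sum(abs(R[upper.tri(R)])^p)
}

## a norm of the correlation matrix, here the Frobenius norm
phi_frobenius <- function(R) {
  norm(R, type = "F")
}

phi_power(R, p = 2)
phi_abs_power(R, p = 1)
phi_frobenius(R)
```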

Another interesting aspect, related to multinormal maximum likelihood estimation, is \[ \phi(R)=\min_{\Gamma\in\mathcal{G}} \log\mathbf{det}(\Gamma)+\mathbf{tr}\ R\Gamma^{-1}, \] where \(\mathcal{G}\) is some (possibly parametrized) subset of the correlation matrices. For instance, \(\mathcal{G}\) could be all correlation matrices satisfying some factor analysis or structural equations model. To compute \(\phi\) we have to calculate the multinormal maximum likelihood estimate of the model. As a pointwise minimum of functions that are affine in \(R\), the aspect \(\phi\) is concave in \(R\), so in our framework we minimize it over \(\mathcal{R}\).
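As a hedged illustration, the sketch below takes \(\mathcal{G}\) to be the one-factor correlation matrices \(\Gamma=\lambda\lambda'+\mathrm{diag}(1-\lambda^2)\) and minimizes the multinormal loss over the loadings with optim(). The names phi_ml and factor_loss are illustrative; a serious implementation would use dedicated factor analysis software.

```r
## Hedged sketch: phi(R) = min over Gamma in G of log det(Gamma) + tr(R Gamma^{-1}),
## with G the one-factor correlation matrices Gamma = lambda lambda' + diag(1 - lambda^2).
## phi_ml and factor_loss are illustrative names.
phi_ml <- function(R, start = rep(0.5, nrow(R))) {
  factor_loss <- function(lambda) {
    gamma <- tcrossprod(lambda) + diag(1 - lambda^2)
    as.numeric(determinant(gamma, logarithm = TRUE)$modulus) +
      sum(diag(R %*% solve(gamma)))
  }
  ## keep |lambda| < 1 so that Gamma stays positive definite with unit diagonal
  opt <- optim(start, factor_loss, method = "L-BFGS-B",
               lower = -0.99, upper = 0.99)
  opt$value
}

set.seed(1)
R <- cor(matrix(rnorm(100 * 4), 100, 4))
phi_ml(R)
```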

4.2 Stationary Equations

The stationary equations when optimizing the aspect \(\phi\) over the centered and standardized transformations \(x_j\) are \[ \sum_{\ell=1}^m \frac{\partial\phi}{\partial r_{j\ell}}\mathbf{E}(x_\ell|x_j)=\lambda_jx_j, \] for \(j=1,\dots,m\), where \(\lambda_j\) is the Lagrange multiplier corresponding to the normalization of \(x_j\).
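For a concrete, if hedged, reading of this equation, assume categorical variables whose transformations are centered and standardized category quantifications; then \(\mathbf{E}(x_\ell|x_j)\) is the vector of means of \(x_\ell\) within the categories of variable \(j\). For the sum-of-correlations aspect the partial derivatives are one off the diagonal, and the sketch below (with illustrative names cond_exp and stationary_lhs) evaluates the left-hand side, which at a stationary point is proportional to \(x_j\).

```r
## Hedged sketch of the stationary equations for the sum-of-correlations
## aspect, assuming categorical variables with centered and standardized
## category quantifications. cond_exp and stationary_lhs are illustrative names.
set.seed(2)
n <- 200
g <- replicate(3, sample(1:4, n, replace = TRUE))   # three categorical variables

## arbitrary centered transformations, standardized so that sum(x_j^2) = n
x <- apply(g, 2, function(gj) {
  z <- rnorm(4)[gj]                                 # random category quantifications
  z <- z - mean(z)
  z / sqrt(sum(z^2) / n)
})

## E(x_l | x_j): within-category means of x_l over the categories of variable j
cond_exp <- function(xl, gj) ave(xl, gj)

## left-hand side of the stationary equation for variable j; the l = j term
## drops because the aspect does not depend on the unit diagonal of R
stationary_lhs <- function(x, g, j) {
  rowSums(sapply(setdiff(seq_len(ncol(x)), j), function(l) cond_exp(x[, l], g[, j])))
}

## at a stationary point this vector is proportional to x[, j]; for the
## arbitrary transformations above it generally is not
cor(stationary_lhs(x, g, 1), x[, 1])
```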

4.3 Bilinearizability

References

De Leeuw, J. 2004. “Least Squares Optimal Scaling of Partially Observed Linear Systems.” In Recent Developments in Structural Equation Models, edited by K. van Montfort, J. Oud, and A. Satorra. Dordrecht, Netherlands: Kluwer Academic Publishers. http://www.stat.ucla.edu/~deleeuw/janspubs/2004/chapters/deleeuw_C_04a.pdf.

De Leeuw, J. 1988. “Multivariate Analysis with Linearizable Regressions.” Psychometrika 53: 437–54. http://www.stat.ucla.edu/~deleeuw/janspubs/1988/articles/deleeuw_A_88a.pdf.

De Leeuw, J., G. Michailidis, and D. Y. Wang. 1999. “Correspondence Analysis Techniques.” In Multivariate Analysis, Design of Experiments, and Survey Sampling, edited by S. Ghosh, 523–47. Marcel Dekker. http://www.stat.ucla.edu/~deleeuw/janspubs/1999/chapters/deleeuw_michailidis_wang_C_99.pdf.

Mair, P., and J. De Leeuw. 2010. “A General Framework for Multivariate Analysis with Optimal Scaling: The R Package aspect.” Journal of Statistical Software 32 (9): 1–23. http://www.stat.ucla.edu/~deleeuw/janspubs/2010/articles/mair_deleeuw_A_10.pdf.