## 6.1 What is PLSC?

Purpose:
PLSC (Partial Least Squares Correlation, also called Partial Least Squares Covariance) is a statistical technique rooted in PCA that analyzes the relationship between two sets of variables measured on the same observations. While PCA assesses the maximum inertia (the largest variance of the observations on each component), PLSC assesses the maximum covariance between the two sets of variables. When both data tables are centered and scaled, this covariance can equivalently be interpreted as the correlation between the two sets of variables.

Essential Steps:
1. PLSC uses the SVD to find the weights P of data table 1 (X) and the weights Q of data table 2 (Y). This step aims to find the commonality between the two tables.
2. Find the latent variables (also known as new axes or components) by multiplying the weights with their respective data tables. This step maximizes the covariance, which is captured by the diagonal matrix of the SVD. We get latent variables Lx = XP and Ly = YQ.
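The two steps above can be sketched in NumPy. This is a minimal illustration, not a full PLSC implementation: the data tables X and Y and their sizes are made-up assumptions, and the cross-product matrix R = XᵀY is decomposed with a plain SVD to obtain the weights P and Q and the latent variables.

```python
import numpy as np

# Hypothetical example: two data tables measured on the same 20 observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))   # data table 1: 20 observations x 4 variables
Y = rng.normal(size=(20, 3))   # data table 2: 20 observations x 3 variables

# Center and scale each column (the condition under which covariance
# can be read as correlation).
Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
Yc = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)

# Cross-product matrix holding the covariances between the two tables.
R = Xc.T @ Yc

# Step 1: the SVD of R gives the weights P (for X) and Q (for Y).
P, delta, Qt = np.linalg.svd(R, full_matrices=False)
Q = Qt.T

# Step 2: latent variables are the data tables projected onto their weights.
Lx = Xc @ P
Ly = Yc @ Q

# The covariance of each pair of latent variables equals the corresponding
# singular value on the diagonal matrix of the SVD.
print(np.allclose(np.diag(Lx.T @ Ly), delta))  # True
```

Note that because the singular values are returned in decreasing order, the first pair of latent variables (the first columns of Lx and Ly) captures the largest share of the covariance between the two tables.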

Notes on Some Important Aids:
1. Components or dimensions are typically called latent variables in the context of PLSC.
2. Scree Plot: displays how much of the covariance between the two sets of variables is explained by each latent dimension.
3. Saliences: the singular values are the square roots of the eigenvalues. Saliences help reveal the relationship between the variables on each latent dimension and how good a component is. I think of them as the equivalent of loadings in PCA.
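The scree-plot quantities can be computed directly from the singular values: squaring them gives the eigenvalues, and each eigenvalue's share of the total is the covariance explained by that latent dimension. The data tables here are the same made-up example as above.

```python
import numpy as np

# Same hypothetical data tables as before, centered and scaled.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))
Y = rng.normal(size=(20, 3))
Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
Yc = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)

# Singular values of the cross-product matrix between the two tables.
delta = np.linalg.svd(Xc.T @ Yc, compute_uv=False)

# Eigenvalues are the squared singular values; a scree plot shows, for each
# latent dimension, its share of the total covariance.
eigenvalues = delta ** 2
explained = eigenvalues / eigenvalues.sum()
for i, (ev, pct) in enumerate(zip(eigenvalues, explained), start=1):
    print(f"Latent {i}: eigenvalue = {ev:.3f}, explained covariance = {pct:.1%}")
```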

Author’s Notes:
PLSC is a great addition to our multivariate toolbox. PLSC is versatile in that it can relate data that normally cannot be studied together, since it relates two data tables at a time. Thus, there are many variations of PLS, such as PLSC based on covariance (Behavior PLSC, Task PLSC, Seed PLSC, Spatio-Temporal PLSC), PLS for distances, PLS for qualitative data (PLS-CA), and PLS for multiple data tables. In terms of PLS's relation to machine learning, PLSC is most similar to feedforward artificial neural networks trained with backpropagation. Finding commonality using covariance is also the bedrock of computer vision and much other object-detection research.