“…Thus our generalizations — first established in Section 2.2, and expanded upon in Discussion — accommodate: almost any data type, various metrics (e.g., Hellinger distance), various optimizations (e.g., PLS-, CCA-, or RDA-type optimizations), and even two strategies for ridge-like regularization. We have foregone any discussion of inference, stability, and resampling for PLS-CA-R because, as a generalization of PLS-R, many inference and stability approaches still apply — such as feature selection or sparsification (Sutton et al., 2018), additional regularization or sparsification approaches (Le Floch et al., 2012; Guillemot et al., 2019; Tenenhaus et al., 2014; Tenenhaus & Tenenhaus, 2011), cross-validation (Wold et al., 1987; Rodríguez-Pérez et al., 2018; Kvalheim et al., 2019; Abdi, 2010), permutation (Berry et al., 2011), various bootstrap (Efron, 1979; Chernick, 2008) approaches (Abdi, 2010; Takane & Jung, 2009) or tests (McIntosh & Lobaugh, 2004; Krishnan et al., 2011), and other frameworks such as split-half resampling (Strother et al., 2002; Kovacevic et al., 2013; Strother et al., 2004) — and are easily adapted for the PLS-CA-R and GPLS frameworks.…”
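As a minimal illustration of the Hellinger distance mentioned above as an alternative metric, the sketch below computes it for two discrete distributions. The function name and plain-Python implementation are our own illustrative choices, not taken from the paper:

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions.

    H(p, q) = (1 / sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))**2).
    It is bounded in [0, 1]: 0 when p equals q, 1 when their supports
    are disjoint.
    """
    if len(p) != len(q):
        raise ValueError("distributions must have the same length")
    return math.sqrt(
        sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))
    ) / math.sqrt(2)

# Identical distributions have distance 0; disjoint supports give 1.
print(hellinger([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Because it acts on square roots of proportions, this metric fits naturally into correspondence-analysis-style preprocessing of profile data.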