Sufficient Dimension Reduction in Regression and Classification: 
An overview and recent results for matrix-valued predictors

 
Abstract

We consider the general regression/classification problem of regressing
a response of general form (univariate, multivariate, or tensor-valued)
on predictors of equally general form. We operate in the context of sufficient
dimension reduction (SDR) where predictors are replaced by sufficient 
reductions without loss of information. SDR methodology includes 
likelihood-based and non-likelihood-based methods. The former assume knowledge
of either the joint family of distributions of the response and
the predictors, or the conditional family of distributions of
the predictors given the response. The most researched branch of sufficient
dimension reduction is non-likelihood based and comprises three classes
of methods: inverse regression-based, semiparametric, and nonparametric.
A high-level review of these approaches will be presented. The focus will
be on likelihood-based SDR, which guarantees an exhaustive and maximal
dimension reduction. The case of matrix-valued predictors, with its
generalization to tensor-valued predictors, will be presented in greater detail.
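
For reference, the notion of a sufficient reduction invoked above can be stated formally; this is the standard condition from the SDR literature, with the symbol R for the reduction used purely for illustration:

```latex
% A measurable map $R$ is a sufficient reduction for the regression of
% $Y$ on $X$ if $Y$ and $X$ are conditionally independent given $R(X)$:
Y \mathrel{\perp\!\!\!\perp} X \mid R(X),
% equivalently, the conditional distribution of $Y$ is unchanged by the reduction:
F(Y \mid X) = F\bigl(Y \mid R(X)\bigr).
```

In words: replacing the predictors X by R(X) loses no information about the response Y, which is the sense in which SDR methods reduce dimension "without loss of information."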