Our paper "Multi-task Copula by Sparse Graph Regression" has been accepted to KDD 2014, so we can talk at the conference in NYC, August 24-27. Before that, let me introduce this new method.
In summary, we tackle two challenging learning problems:
1) How to model a rich class (non-Gaussian, asymmetric, etc.) of joint likelihood functions over multiple outputs? Our answer: multi-task copula.
2) How to (efficiently) estimate a sparse graph function G(Y, E) = F(X), i.e., the sparse dependency graph over the outputs Y as a function of the inputs X? Our answer: sparse graph regression.
This is the first copula model for the multi-task or multi-output regression problem (yes, yes, I hear someone whispering "Ceee-Rrrr-Ffff"). The separable semiparametric form of the copula model allows us to select an arbitrary distribution class to model each marginal likelihood p(Y_i|X), and to model the non-Gaussian dependency between outputs by estimating a Gaussian precision matrix (as normally seen in covariance selection, Gaussian graphical models, etc.). In addition, it enables a two-stage learning scheme that learns the marginals and the conditional dependency graph separately.
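To make the two-stage idea concrete, here is a minimal sketch of Gaussian-copula estimation on toy data. It is not the paper's algorithm: the marginals are estimated by simple empirical-CDF ranks rather than a learned regression model, and there is no conditioning on X or sparsity; it only illustrates "marginals first, Gaussian dependency second".

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: three outputs with non-Gaussian marginals but a Gaussian copula.
n = 2000
z = rng.multivariate_normal(np.zeros(3),
                            [[1.0, 0.6, 0.3],
                             [0.6, 1.0, 0.5],
                             [0.3, 0.5, 1.0]], size=n)
u = stats.norm.cdf(z)                               # latent uniforms
Y = np.column_stack([stats.expon.ppf(u[:, 0]),      # exponential marginal
                     stats.gamma.ppf(u[:, 1], a=2.0),
                     stats.norm.ppf(u[:, 2], loc=5.0)])

# Stage 1: estimate each marginal separately (here: empirical CDF via ranks).
ranks = np.argsort(np.argsort(Y, axis=0), axis=0) + 1
u_hat = ranks / (n + 1)                             # pseudo-observations in (0, 1)

# Stage 2: map to Gaussian scores and estimate the precision matrix,
# whose nonzeros encode the output dependency graph.
z_hat = stats.norm.ppf(u_hat)
precision = np.linalg.inv(np.cov(z_hat, rowvar=False))
```

Because the two stages are decoupled, any marginal model (including one learned by classical MTL) can be plugged into stage 1 without touching stage 2.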
Sparse graph regression (SpaGraphR) is a fast tool to learn the dependency graph. It should also be of independent interest to anyone who needs to achieve a dependency graph varying with input, time, or other factors. In statistics this problem is usually called "covariance regression"; what we seek here is covariance regression with a sparse graph structure, i.e., the covariance function is smooth but its corresponding graph is usually sparse. Given m input instances, evaluating this function gives you m sparse graphs over the outputs, and similar inputs lead to similar graph structures. This is much harder than covariance selection, which only needs to estimate one graph. However, we show that our algorithm can use similar (or even less) computation than covariance selection to find a nonparametric estimator of the covariance function.
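A minimal caricature of "covariance regression with a sparse graph" looks like this. It is not SpaGraphR itself: the paper's fast operator is replaced by plain soft-thresholding of a kernel-weighted covariance, and all function names and parameters below are my own illustration.

```python
import numpy as np

def kernel_weights(x_query, X, bandwidth=0.5):
    """Gaussian kernel weights of the training inputs around a query point."""
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return w / w.sum()

def sparse_graph_at(x_query, X, Y, bandwidth=0.5, threshold=0.1):
    """Kernel-smoothed covariance of the outputs Y at input x_query,
    sparsified by soft-thresholding the off-diagonal entries."""
    w = kernel_weights(x_query, X, bandwidth)
    mu = w @ Y                                    # local mean of outputs
    resid = Y - mu
    cov = (w[:, None] * resid).T @ resid          # weighted sample covariance
    off = cov - np.diag(np.diag(cov))
    off = np.sign(off) * np.maximum(np.abs(off) - threshold, 0.0)
    return np.diag(np.diag(cov)) + off            # nonzeros = graph edges
```

Evaluating `sparse_graph_at` at m query points yields m sparse graphs, and because the kernel weights vary smoothly with the query, nearby inputs produce similar graphs, which is the behavior the covariance function is supposed to have.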
Here comes the Abstract of our paper:
This paper proposes multi-task copula (MTC), which can handle a much wider class of tasks than the mean regression with Gaussian noise assumed in most former multi-task learning (MTL). While former MTL emphasizes shared structure among models, MTC aims at joint prediction to exploit inter-output correlation. Given the input, the outputs of MTC are allowed to follow an arbitrary joint continuous distribution. MTC captures the joint likelihood of multiple outputs by first learning the marginal of each output and then a sparse and smooth output dependency graph function. While the former can be achieved by classical MTL, learning graphs that dynamically vary with the input is quite a challenge. We address this issue by developing sparse graph regression (SpaGraphR), a non-parametric estimator incorporating kernel smoothing, maximum likelihood, and sparse graph structure to gain a fast learning algorithm. It starts from a few seed graphs on a few input points, and then updates the graphs on other input points by a fast operator via coarse-to-fine propagation. Due to the power of copulas in modeling semi-parametric distributions, SpaGraphR can model a rich class of dynamic non-Gaussian correlations. We show that MTC can address more flexible and difficult tasks that do not fit the assumptions of former MTL nicely, and can fully exploit their relatedness. Experiments on robotic control and stock price prediction justify its appealing performance in challenging MTL problems.
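The "seed graphs plus propagation" idea in the abstract can be caricatured as follows. This is emphatically not the paper's fast operator: here the refinement is just a few gradient-ascent steps on a local Gaussian log-likelihood, warm-started from the nearest seed graph, and every name and parameter is my own illustration.

```python
import numpy as np

def local_cov(x_query, X, Y, bandwidth=0.5):
    """Kernel-weighted covariance of the outputs at x_query (helper)."""
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * bandwidth ** 2))
    w /= w.sum()
    resid = Y - w @ Y
    return resid.T @ (w[:, None] * resid)

def propagate_graphs(X, Y, seed_idx, n_steps=5, step=0.1):
    """Estimate precision matrices at a few seed inputs, then warm-start
    every other input from its nearest seed and refine with a few
    gradient steps on the local Gaussian log-likelihood."""
    graphs = {i: np.linalg.inv(local_cov(X[i], X, Y)) for i in seed_idx}
    for i in range(len(X)):
        if i in graphs:
            continue
        nearest = min(seed_idx, key=lambda j: np.linalg.norm(X[i] - X[j]))
        P = graphs[nearest].copy()        # warm start from the seed graph
        S = local_cov(X[i], X, Y)
        for _ in range(n_steps):
            # gradient of log det(P) - tr(S P) with respect to P
            P += step * (np.linalg.inv(P) - S)
            P = (P + P.T) / 2             # keep the estimate symmetric
        graphs[i] = P
    return graphs
```

Because nearby inputs have similar local covariances, a warm start from a neighboring graph needs only a few refinement steps, which is the intuition behind getting the full covariance function at a cost comparable to a single covariance-selection run.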