
Category Archives: Tianyi’s work
Divide-and-Conquer Learning by Anchoring a Conical Hull
Many well-known machine learning methods aim to draw a line between two classes. However, in our recently accepted NIPS 2014 paper “Divide-and-Conquer Learning by Anchoring a Conical Hull”, we reduce many fundamental machine learning problems (a broad class of … Continue reading
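For intuition about the anchoring idea: under a separable assumption, the anchors (extreme rays) of a data cone can be found by a greedy successive-projection-style selection. The sketch below is only my minimal illustration of such anchoring, not the DCA algorithm from the paper; the function name is made up.

```python
import numpy as np

def select_anchors(X, k):
    """Greedy successive-projection-style anchor selection (illustrative):
    repeatedly pick the column of largest norm, then project every column
    onto the orthogonal complement of the picked direction. Under a
    separability assumption the picked columns are the extreme rays
    (anchors) of the conical hull of the data columns."""
    R = X.astype(float).copy()
    anchors = []
    for _ in range(k):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        anchors.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(u, u @ R)   # remove the chosen direction
    return anchors
```

Greedy projection works here because a column that is a (sub-)convex combination of the anchors can never have a larger norm than the largest remaining anchor, so each pick must be an anchor.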
Multi-task Copula – A semi-parametric joint prediction model for multiple outputs with sparse graph structure
Our paper “Multi-task Copula by Sparse Graph Regression” has been accepted by KDD 2014, so we will present it at the conference in NYC, August 24–27. Before that, let me introduce this new method. In summary, we tackle … Continue reading
Posted in Tianyi's work
Tagged fast algorithm, Hamming Compressed Sensing, structured learning
NeSVM (Nesterov’s method for SVM) code for our ICDM 2010 paper
You can now download the MATLAB code for NeSVM from here. In the code, options.mu is a key parameter that adjusts the trade-off between a consistent decrease of the primal objective function and speed, so you need to tune it roughly to … Continue reading
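If you want a feel for what options.mu controls before downloading the code, here is a minimal Python sketch (the released code is MATLAB; the names and defaults here are my illustrative assumptions, not NeSVM itself) of Nesterov's accelerated gradient applied to a mu-smoothed hinge loss. A small mu tracks the true hinge objective closely but forces a smaller step size; a large mu is faster but cruder.

```python
import numpy as np

def smoothed_hinge_grad(w, X, y, mu, lam):
    """Gradient of the mu-smoothed hinge loss plus an L2 regularizer.
    The clipped variable `a` comes from Nesterov's smoothing of max(0, 1-z)."""
    z = y * (X @ w)
    a = np.clip((1.0 - z) / mu, 0.0, 1.0)
    return -(X.T @ (a * y)) / len(y) + lam * w

def nesterov_svm(X, y, mu=0.1, lam=1e-2, n_iter=200):
    """Accelerated (Nesterov) gradient descent on the smoothed SVM primal.
    Step size 1/L grows with mu: smoother objective -> larger safe steps."""
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2 / (n * mu) + lam  # Lipschitz estimate
    w = np.zeros(d); v = w.copy(); t = 1.0
    for _ in range(n_iter):
        w_new = v - smoothed_hinge_grad(v, X, y, mu, lam) / L
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        v = w_new + (t - 1.0) / t_new * (w_new - w)   # momentum extrapolation
        w, t = w_new, t_new
    return w
```

Note how mu enters the Lipschitz constant L: this is exactly the speed-versus-accuracy trade-off the post mentions.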
AISTATS 2013 GreBsmo code is released
Here is the GreBsmo code for our AISTATS 2013 paper. You can use it as a greedy version of the GoDec solver for the X = L + S problem. It is much faster and more robust. There are three video subsequences you can play with in … Continue reading
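As background on the X = L + S problem GreBsmo solves, a naive GoDec-style alternating scheme (a simplified sketch of the idea, not the greedy bilateral algorithm in the paper; names and parameters are illustrative) alternates a truncated SVD for the low-rank part L with entrywise hard thresholding for the sparse part S:

```python
import numpy as np

def godec_sketch(X, rank, card, n_iter=20):
    """Naive alternating X = L + S decomposition in the spirit of GoDec:
    L is the best rank-`rank` approximation of X - S (truncated SVD),
    S keeps only the `card` largest-magnitude entries of X - L."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        R = X - L
        thresh = np.sort(np.abs(R), axis=None)[-card]  # card-th largest magnitude
        S = np.where(np.abs(R) >= thresh, R, 0.0)
    return L, S
```

The greedy bilateral version in the paper replaces the expensive full SVD step with cheap bilateral sketches, which is where its speed-up comes from.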
[Best student paper award] Welcome to my “Divide-and-Conquer Anchoring (DCA)” talk at ICDM, Dallas, Dec 8
Is it possible to finish a 60000×10000 matrix decomposition (NMF, PCA, etc.) or completion in 6 seconds in MATLAB on your laptop? Can we make it even faster with a simple distributable scheme? How can we summarize a huge-scale dataset (ratings, movies, … Continue reading
Our DMKD paper is selected as a Top-5 Editor’s Choice Article for Free Reading
Prof. Geoff Webb, the Editor-in-Chief of Data Mining and Knowledge Discovery (Springer), announced on his KDnuggets website that our paper “Manifold Elastic Net: A Unified Framework for Sparse Dimension Reduction”, which was published in the DMKD journal in 2011 and cited … Continue reading
Posted in Tianyi's work
Tagged dimension reduction, elastic net, fast algorithm, feature selection, manifold learning, sparse learning
Greedy Bilateral (GreB) Paradigm for Large-scale Matrix Completion, Robust PCA and Low-rank Approximation
Our paper “Greedy Bilateral Sketch, Completion and Smoothing” has been accepted by AISTATS 2013. The abstract is below, the PDF is here, and code will be coming soon. Abstract: Recovering a large low-rank matrix from highly corrupted, incomplete or sparse outlier-overwhelmed … Continue reading
Compressed Labeling: An important extension of Hamming Compressed Sensing; at NIPS now
We have just been informed that our submission “Compressed Labeling (CL) on Distilled Labelsets (DL) for Multilabel Learning” has been accepted by the Machine Learning journal (Springer). The online-first PDF can be downloaded here. CL is an important application and extension of Hamming … Continue reading
Semi-Soft GoDec: >4 times faster, auto-determined k
Here is good news about GoDec (pertaining to our ICML 2011 paper): Semi-Soft GoDec is released. Different from the ordinary GoDec, which imposes hard thresholding on both the singular values of the low-rank part L and the entries of the … Continue reading
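The hard vs. soft distinction in the post title can be illustrated with the two entrywise operators involved (a minimal sketch; the function names are mine, not from the released code):

```python
import numpy as np

def hard_threshold(x, k):
    """Keep only the k largest-magnitude entries (what ordinary GoDec
    applies to the sparse part S): the cardinality k must be fixed."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def soft_threshold(x, tau):
    """Shrink every entry toward zero by tau (the Semi-Soft variant's
    operator for S): entries with |x| <= tau become exactly zero, so the
    support size is determined automatically by tau."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```

Soft thresholding makes the sparsity level a byproduct of the threshold tau, which is why the cardinality k no longer needs to be set in advance.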
Hamming Compressed Sensing: recovering k-bit quantization from 1-bit measurements with a linear, non-iterative algorithm
We developed a new compressed-sensing-type signal acquisition paradigm called “Hamming Compressed Sensing (HCS)” that recovers a signal’s k-bit quantization rather than the signal itself. Directly recovering the quantization is much preferred in practical digital systems. HCS provides a linear, non-iterative quantization … Continue reading
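For readers new to the setting, here is a background sketch of the two objects involved: standard 1-bit measurements (the input) and a uniform k-bit quantizer (the kind of output HCS recovers directly). This is background only, not the HCS algorithm, and the names are illustrative.

```python
import numpy as np

def one_bit_measurements(x, m, rng=None):
    """Standard 1-bit compressed-sensing acquisition: only the signs of
    m random Gaussian projections of the signal are kept."""
    rng = rng or np.random.default_rng(0)
    A = rng.normal(size=(m, x.size))
    return np.sign(A @ x), A

def kbit_quantize(x, k, lo=-1.0, hi=1.0):
    """Uniform k-bit quantizer on [lo, hi]: maps each entry to one of
    2**k levels -- the kind of target HCS recovers without first
    reconstructing the real-valued signal."""
    levels = 2 ** k
    q = np.floor((np.clip(x, lo, hi) - lo) / (hi - lo) * levels)
    return np.clip(q, 0, levels - 1).astype(int)
```

The point of HCS is to map the sign vector to the k-bit codes directly, skipping the usual iterative real-valued reconstruction.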
Posted in Tianyi's work
Tagged 1-bit measurements, compressed sensing, fast algorithm, quantization recovery