Fused lasso in graph estimation problems
Thomas Bldg
Start Date: Tue, Jan 22, 2019, 3:30 PM
End Date: Tue, Jan 22, 2019, 4:30 PM
Presented By: Oscar Madrid Padilla, University of California, Berkeley
The fused lasso, also known as (anisotropic) total variation denoising, is widely used for piecewise constant signal estimation with respect to a given undirected graph. In this talk I will describe theory and methods for the fused lasso. Two classes of problems will be discussed: denoising on graphs, and nonparametric regression on general metric spaces. For the first of these tasks, I will provide a general upper bound on the mean squared error of the fused lasso that depends on the sample size and the total variation of the underlying signal. I will show that this upper bound is minimax rate optimal when the graph is a tree of bounded degree, and I will present a surrogate estimator that attains the same upper bound and can be computed in linear time. The second part of the talk will focus on extending the fused lasso to general nonparametric regression. The resulting approach, which we call the K-nearest neighbors (K-NN) fused lasso, involves (i) computing the K-NN graph of the design points, and (ii) performing the fused lasso over this K-NN graph. I will discuss several theoretical advantages over competing approaches: the estimator inherits local adaptivity from its connection to the fused lasso, and manifold adaptivity from its connection to the K-NN approach. Finally, I will briefly mention some of my other research directions.
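
To make the two-step procedure in the abstract concrete, the following Python code is a minimal illustrative sketch, not the speaker's implementation: it builds the K-NN graph of the design points with scikit-learn and then solves the graph fused lasso objective, minimize (1/2)||y - theta||^2 + lambda * sum over edges |theta_i - theta_j|, with the generic convex solver cvxpy. The solver choice and the values of K and lambda are assumptions made purely for illustration.

# Minimal sketch of the K-NN fused lasso (illustrative assumptions:
# cvxpy as the solver, and the values of K and lam).
import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph

def knn_fused_lasso(X, y, K=5, lam=1.0):
    """Estimate a piecewise constant signal over the K-NN graph of X."""
    n = X.shape[0]
    # Step (i): K-NN graph of the design points (symmetrized adjacency).
    A = kneighbors_graph(X, n_neighbors=K, mode="connectivity")
    A = A.maximum(A.T).tocoo()
    edges = [(i, j) for i, j in zip(A.row, A.col) if i < j]

    # Step (ii): fused lasso over this graph, i.e. minimize
    # 0.5 * ||y - theta||^2 + lam * sum_{(i,j) in E} |theta_i - theta_j|.
    theta = cp.Variable(n)
    tv_penalty = cp.sum(cp.abs(cp.hstack([theta[i] - theta[j] for i, j in edges])))
    objective = cp.Minimize(0.5 * cp.sum_squares(y - theta) + lam * tv_penalty)
    cp.Problem(objective).solve()
    return theta.value

# Usage example: noisy observations of a piecewise constant signal on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 1))
y = np.where(X[:, 0] > 0.5, 2.0, 0.0) + rng.normal(scale=0.5, size=200)
theta_hat = knn_fused_lasso(X, y, K=5, lam=2.0)

A generic solver like this scales poorly with the number of design points; the linear-time surrogate estimator mentioned in the abstract is the kind of specialized algorithm the talk addresses, and is not what this sketch implements.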