candidates

From Zoltan

Peter Richtarik (“http://www.maths.ed.ac.uk/~richtarik/”) gave a nice talk on optimization last week at the UCL Big Data workshop. The papers with the details are:

  • Coordinate descent with arbitrary sampling I: algorithms and complexity (“http://www.maths.ed.ac.uk/~richtarik/papers/alpha1.pdf”),
  • Coordinate descent with arbitrary sampling II: expected separable overapproximation (“http://www.maths.ed.ac.uk/~richtarik/papers/alpha2.pdf”),
  • Idea: they propose a unified randomized block-coordinate descent framework with arbitrary sampling (i.e., an arbitrary distribution over which coordinates to optimize at a given iteration), together with its analysis. In certain cases the optimal block-generating distribution can be determined analytically. The framework fits parallel and distributed architectures nicely; as an example, they solved a Lasso problem with on the order of a billion variables.
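To make the "arbitrary sampling" idea concrete, here is a minimal sketch of randomized coordinate descent with a nonuniform sampling distribution on a least-squares problem. This is an illustration only, not the authors' algorithm: their ESO-based step sizes are replaced here by simple coordinate-wise Lipschitz constants, and the sampling distribution is the standard importance-sampling choice proportional to those constants.

```python
import numpy as np

# Sketch: randomized coordinate descent with nonuniform sampling for
#   min_x 0.5 * ||A x - b||^2.
# Coordinate i is sampled with probability proportional to its
# Lipschitz constant L_i = ||A[:, i]||^2 (an assumption for this demo;
# the papers derive step sizes and samplings far more generally).

rng = np.random.default_rng(0)
n, d = 200, 50
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

L = (A ** 2).sum(axis=0)       # coordinate-wise Lipschitz constants
p = L / L.sum()                # sampling distribution over coordinates

x = np.zeros(d)
r = A @ x - b                  # maintained residual A x - b
for _ in range(20000):
    i = rng.choice(d, p=p)     # sample one coordinate ~ p
    g = A[:, i] @ r            # partial derivative w.r.t. x_i
    step = g / L[i]
    x[i] -= step
    r -= step * A[:, i]        # cheap O(n) residual update

print(0.5 * np.sum(r ** 2))    # objective value, driven near zero
```

Updating only the sampled coordinate (and the residual in O(n)) is what makes each iteration cheap enough to scale to the billion-variable regime mentioned above.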

At NIPS I had a chat with the second author of the paper below (“http://www-scf.usc.edu/~hsedghi/”); the work also came up when Anima (“http://newport.eecs.uci.edu/anandkumar/”) visited us last week:

  • Score Function Features for Discriminative Learning: Matrix and Tensor Framework (“http://arxiv.org/abs/1412.2863”).
  • Idea: feedforward neural networks can be trained using spectral methods together with higher-order extensions of score functions.
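A minimal sketch of the first-order score-function trick the paper builds on: for Gaussian input x ~ N(0, I), the score function is S_1(x) = -∇ log p(x) = x, and Stein's identity gives E[y · S_1(x)] = E[∇_x E[y|x]]. The toy one-neuron "network" y = sigmoid(w·x) below is my own assumed example; the point is that the moment E[y · x] recovers the direction of the weight vector w, which is only the simplest instance of the matrix/tensor framework in the paper, not its full training algorithm.

```python
import numpy as np

# Sketch: first-order score function features under Gaussian input.
# For x ~ N(0, I), S_1(x) = x, and Stein's identity implies
#   E[y * x] = E[sigmoid'(w . x)] * w,
# so the empirical moment aligns with the (hypothetical) weight w.

rng = np.random.default_rng(0)
d, n = 5, 200000
w = rng.standard_normal(d)
w /= np.linalg.norm(w)                   # ground-truth unit weight vector

x = rng.standard_normal((n, d))          # Gaussian inputs
y = 1.0 / (1.0 + np.exp(-(x @ w)))       # labels from the toy neuron

m = (y[:, None] * x).mean(axis=0)        # empirical E[y * S_1(x)]
m /= np.linalg.norm(m)

print(abs(m @ w))                        # cosine alignment with w, near 1
```

Higher-order score functions S_2, S_3 yield matrix- and tensor-valued moments in the same way, which is where the spectral/tensor decomposition machinery of the paper comes in.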
candidates.txt · Last modified: 2015/01/20 12:55 by wittawat