Seminar by İlker Birbil

İlker Birbil
26/02/2016, 13:30

Scalable Distributed Tensor Factorizations with Incremental Quadratic Approximations

Seminar by
İlker Birbil
Sabancı University

Abstract:

We propose HAMSI, a provably convergent incremental algorithm for solving large-scale partially separable optimization problems that frequently emerge in machine learning and inferential statistics. The algorithm is based on a local quadratic approximation and hence allows incorporating second-order curvature information to speed up convergence. Furthermore, HAMSI needs almost no tuning, and it is scalable as well as easily parallelizable. In large-scale simulation studies, we illustrate that the method is superior to a state-of-the-art distributed stochastic gradient descent method in terms of convergence behavior. This performance gain comes at the expense of memory that scales only linearly with the total size of the optimization variables. We conclude that HAMSI may be considered a viable alternative in many scenarios where first-order methods based on variants of stochastic gradient descent are applicable.
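To give a flavor of the setting described above, the following is a minimal Python sketch of one incremental pass over a partially separable objective f(x) = Σᵢ fᵢ(x[Sᵢ]), where each component is visited in turn and a step is taken along a local quadratic model. This is only an illustration: the scalar Barzilai-Borwein curvature estimate, the helper names (`incremental_quadratic_step`, `blocks`, `grads`, `state`), and the guard constants are assumptions for the sketch, not the actual HAMSI algorithm, which uses its own shared second-order approximation.

```python
# Illustrative sketch only; not the authors' HAMSI implementation.
# Objective: f(x) = sum_i f_i(x[S_i]) over (possibly overlapping) index sets S_i.
import numpy as np

def incremental_quadratic_step(x, blocks, grads, state, lr_max=1.0):
    """One incremental pass; each component step uses a local quadratic model
    with a scalar Barzilai-Borwein (BB2) curvature estimate per block."""
    for i, (idx, grad_fn) in enumerate(zip(blocks, grads)):
        g = grad_fn(x[idx])              # gradient of f_i on its own block
        prev = state.get(i)              # (x_prev, g_prev) from the last visit
        if prev is not None:
            s, y = x[idx] - prev[0], g - prev[1]
            denom = s @ y
            # BB2 scalar: inverse curvature ~ (s.y)/(y.y), guarded for safety
            step = (denom / (y @ y)) if denom > 1e-12 else lr_max
            step = min(step, lr_max)
        else:
            step = lr_max
        state[i] = (x[idx].copy(), g.copy())
        x[idx] -= step * g               # minimizer step of the quadratic model
    return x

# Tiny demo on an overlapping least-squares problem (hypothetical data).
rng = np.random.default_rng(0)
n = 6
blocks = [np.array([0, 1, 2]), np.array([2, 3, 4]), np.array([4, 5, 0])]
As = [rng.standard_normal((4, 3)) for _ in blocks]
bs = [rng.standard_normal(4) for _ in blocks]

def make_grad(A, b):
    return lambda z: A.T @ (A @ z - b)   # gradient of 0.5*||A z - b||^2

grads = [make_grad(A, b) for A, b in zip(As, bs)]
x, state = np.zeros(n), {}
for epoch in range(200):
    x = incremental_quadratic_step(x, blocks, grads, state, lr_max=0.1)

f = sum(0.5 * np.linalg.norm(A @ x[idx] - b) ** 2
        for A, b, idx in zip(As, bs, blocks))
print("final objective:", f)
```

Note that the per-block `state` here is what keeps the memory footprint linear in the total size of the optimization variables, mirroring the trade-off mentioned in the abstract.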

Bio: Ş. İlker Birbil received his PhD degree from North Carolina State University, Raleigh, USA. He then worked for two years as a postdoctoral research fellow in the Netherlands. His research interests include large-scale local and global nonlinear optimization, with particular emphasis on algorithm development. He is an associate editor of the Journal of Industrial and Management Optimization (JIMO). He is also the co-founder of the website www.bolbilim.com, where he writes mostly about his academic life in Turkey. Currently, he is a faculty member at Sabancı University, Istanbul, where he teaches various courses on mathematical programming.
