Parallel stochastic gradient algorithms for large-scale matrix completion
Published in: Mathematical Programming Computation, June 2013, Vol. 5 (2), pp. 201-226
Main Authors: Benjamin Recht, Christopher Ré
Format: Article
Language: English
Summary: This paper develops Jellyfish, an algorithm for solving data-processing problems with matrix-valued decision variables regularized to have low rank. Particular examples of problems solvable by Jellyfish include matrix completion problems and least-squares problems regularized by the nuclear norm or γ₂-norm. Jellyfish implements a projected incremental gradient method with a biased, random ordering of the increments. This biased ordering allows for a parallel implementation that admits a speed-up nearly proportional to the number of processors. On large-scale matrix completion tasks, Jellyfish is orders of magnitude more efficient than existing codes. For example, on the Netflix Prize data set, prior art computes rating predictions in approximately 4 hours, while Jellyfish solves the same problem in under 3 minutes on a 12-core workstation.
ISSN: 1867-2949; 1867-2957
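
For intuition about the method the summary describes, the following is a minimal, serial Python sketch of incremental (stochastic) gradient descent on a factored matrix-completion objective. It is not the authors' Jellyfish code: the function name, step size, regularization, and the plain (unprojected) factored objective are illustrative assumptions, and the shuffle here is uniform, whereas Jellyfish uses a biased ordering so that disjoint blocks of rows and columns can be updated on different cores without locking.

```python
import numpy as np

def factored_sgd_completion(entries, n_rows, n_cols, rank=3,
                            step=0.02, reg=0.01, epochs=30, seed=0):
    """Serial incremental-gradient sketch for matrix completion.

    Minimizes the sum over observed (i, j, v) of (L[i] @ R[j] - v)**2
    plus reg * (||L||_F^2 + ||R||_F^2), visiting the observed entries
    in a freshly shuffled order each epoch.  (Jellyfish instead biases
    this ordering to enable a lock-free parallel implementation.)
    """
    rng = np.random.default_rng(seed)
    L = 0.1 * rng.standard_normal((n_rows, rank))
    R = 0.1 * rng.standard_normal((n_cols, rank))
    entries = list(entries)
    for _ in range(epochs):
        rng.shuffle(entries)               # uniform random ordering of the increments
        for i, j, v in entries:
            err = L[i] @ R[j] - v          # residual on one observed entry
            gL = err * R[j] + reg * L[i]   # gradient w.r.t. row factor i
            gR = err * L[i] + reg * R[j]   # gradient w.r.t. column factor j
            L[i] -= step * gL              # each increment touches one row of L
            R[j] -= step * gR              # and one row of R only
    return L, R

# Usage on a small synthetic low-rank matrix with roughly 30% of entries observed.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
obs = [(i, j, M[i, j]) for i in range(50) for j in range(40) if rng.random() < 0.3]
L, R = factored_sgd_completion(obs, 50, 40, rank=3)
rmse = np.sqrt(np.mean([(L[i] @ R[j] - v) ** 2 for i, j, v in obs]))
print(f"RMSE on observed entries: {rmse:.3f}")
```

As the summary notes, the extra ingredient in Jellyfish is the ordering of the increments: because each update touches only one row of L and one row of R, grouping the observed entries so that concurrently processed increments hit disjoint blocks lets the updates run on separate cores without locks, which is the source of the near-linear speed-up.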