Authors: Ke Ma, Jinshan Zeng, Jiechao Xiong, Qianqian Xu, Xiaochun Cao, Wei Liu, Yuan Yao
arXiv: 1711.06446
Abstract URL: http://arxiv.org/abs/1711.06446v2
Learning representations from relative similarity comparisons, often called
ordinal embedding, has gained increasing attention in recent years. Most
existing methods are batch methods built mainly on convex optimization, e.g.,
the projected gradient descent method. However, they are generally
time-consuming because a singular value decomposition (SVD) is commonly
required at each update, especially when the data size is very large. To
overcome this challenge, we propose a stochastic algorithm called SVRG-SBB,
which has the following features: (a) it is SVD-free via dropping convexity,
and scales well through the use of a stochastic algorithm, namely stochastic
variance reduced gradient (SVRG); and (b) it chooses the step size adaptively
via a new stabilized Barzilai-Borwein (SBB) method, since the original BB rule
designed for convex problems might fail on the considered stochastic
non-convex optimization problem. Moreover, we show that the proposed algorithm
converges to a stationary point at a rate of $\mathcal{O}(\frac{1}{T})$ in our
setting, where $T$ is the total number of iterations. Numerous simulations and
real-world data experiments demonstrate the effectiveness of the proposed
algorithm in comparison with state-of-the-art methods; in particular, it
achieves much lower computational cost with good prediction performance.
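
The abstract names two ingredients: a variance-reduced inner loop (SVRG) and an epoch-level Barzilai-Borwein step size stabilized for non-convex objectives. Below is a minimal Python sketch of how these pieces could fit together for a generic finite-sum objective. The function name svrg_sbb, all parameter defaults, and the exact SBB formula used here (taking the absolute value of the BB curvature term <s, y> and adding an eps * ||s||^2 stabilizer to the denominator) are illustrative assumptions based on the abstract, not the paper's reference implementation.

import numpy as np

def svrg_sbb(grad_i, x0, n, m=None, epochs=30, eps=0.1, eta0=0.01, seed=0):
    """SVRG with a stabilized Barzilai-Borwein (SBB) step size (sketch).

    grad_i(x, i) -- gradient of the i-th component function at x
    n            -- number of component functions in the finite sum
    m            -- inner-loop length per epoch (defaults to n)
    eps          -- stabilization constant in the SBB denominator
    eta0         -- fallback step size used until two snapshots exist
    """
    rng = np.random.default_rng(seed)
    m = m or n
    x_tilde = np.asarray(x0, dtype=float).copy()
    g_tilde = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
    eta, prev_x, prev_g = eta0, None, None
    for _ in range(epochs):
        if prev_x is not None:
            s = x_tilde - prev_x
            y = g_tilde - prev_g
            ss = s.dot(s)
            if ss > 0:
                # Stabilized BB step: |<s, y>| handles negative curvature and
                # the eps*||s||^2 term keeps the denominator away from zero.
                eta = ss / (m * (abs(s.dot(y)) + eps * ss))
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # SVRG variance-reduced gradient estimate.
            v = grad_i(x, i) - grad_i(x_tilde, i) + g_tilde
            x = x - eta * v
        prev_x, prev_g = x_tilde, g_tilde
        x_tilde = x  # take the last inner iterate as the new snapshot
        g_tilde = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
    return x_tilde

As a toy usage example, a least-squares problem could be run with grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]; the SVD-free ordinal embedding loss from the paper would slot in the same way. Note that the SBB step is only computable from the second epoch on, hence the eta0 fallback for the first epoch.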