Authors: Navin Goyal, Abhishek Shetty
arXiv: 1807.04936
Abstract URL: http://arxiv.org/abs/1807.04936v3
Non-Gaussian component analysis (NGCA) is a problem in multidimensional data
analysis which, since its formulation in 2006, has attracted considerable
attention in statistics and machine learning. In this problem, we have a random
variable $X$ in $n$-dimensional Euclidean space. There is an unknown subspace
$\Gamma$ of the $n$-dimensional Euclidean space such that the orthogonal
projection of $X$ onto $\Gamma$ is a standard multidimensional Gaussian and the
orthogonal projection of $X$ onto $\Gamma^{\perp}$, the orthogonal complement
of $\Gamma$, is non-Gaussian, in the sense that all of its one-dimensional
marginals differ from the Gaussian in a certain metric defined in terms of
moments. The NGCA problem is to approximate the non-Gaussian subspace
$\Gamma^{\perp}$ given samples of $X$.
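To make the model concrete, the following is a minimal sketch (Python/NumPy) of sampling from one hypothetical NGCA instance; the rescaled uniform distribution chosen for the non-Gaussian part is an illustrative assumption, since the model only requires that the one-dimensional marginals on $\Gamma^{\perp}$ differ from the Gaussian in moments.

```python
# Minimal sketch: sample from a hypothetical NGCA instance.
# The uniform non-Gaussian part is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

n, k = 10, 2  # ambient dimension n; dim(Gamma_perp) = k
# Random orthonormal basis: first k columns span Gamma_perp, the rest span Gamma.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B_perp, B = Q[:, :k], Q[:, k:]

def sample(m):
    """Draw m samples of X: standard Gaussian on Gamma, non-Gaussian on Gamma_perp."""
    gauss = rng.standard_normal((m, n - k))                    # Gaussian part
    non_gauss = rng.uniform(-np.sqrt(3), np.sqrt(3), (m, k))   # unit-variance uniform
    return gauss @ B.T + non_gauss @ B_perp.T

X = sample(100_000)
# Both parts have unit variance, so the covariance of X is close to the identity.
print(np.round(np.cov(X.T), 2))
```

Note that the covariance of $X$ is close to the identity by construction, which is exactly why second-moment methods reveal nothing about $\Gamma^{\perp}$ in such an instance.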
Vectors in $\Gamma^{\perp}$ correspond to `interesting' directions, whereas
vectors in $\Gamma$ correspond to directions where the data is very noisy. The
most interesting applications of the NGCA model are to the case when the
magnitude of the noise is comparable to that of the true signal, a setting in
which traditional noise reduction techniques such as PCA do not apply directly.
NGCA is also related to dimension reduction and to other data analysis problems
such as ICA. NGCA-like problems have been studied in statistics for a long time
using techniques such as projection pursuit.
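To illustrate the point about PCA: PCA ranks directions by variance, so when the non-Gaussian signal and the Gaussian noise have comparable magnitude the spectrum of the covariance matrix is nearly flat and singles out no direction. A hypothetical two-dimensional check, with a unit-variance uniform signal and unit-variance Gaussian noise:

```python
# Hypothetical illustration: PCA sees identical variance in every direction
# when signal and noise magnitudes match, so it cannot locate Gamma_perp.
import numpy as np

rng = np.random.default_rng(1)
m = 100_000
signal = rng.uniform(-np.sqrt(3), np.sqrt(3), m)  # non-Gaussian, variance 1
noise = rng.standard_normal(m)                    # Gaussian, variance 1
X = np.column_stack([signal, noise])

print(np.round(np.linalg.eigvalsh(np.cov(X.T)), 2))  # ~[1., 1.]: flat spectrum
```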
We give an algorithm that runs in time polynomial in the dimension $n$ and has
an inverse polynomial dependence on the error parameter, which measures the
angular distance between the non-Gaussian subspace and the subspace output by the
algorithm. Our algorithm is based on relative entropy as the contrast function
and fits under the projection pursuit framework. The techniques we develop for
analyzing our algorithm may be of use for other related problems.
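As a rough illustration of the projection pursuit framework (and emphatically not the algorithm of this paper), the sketch below ascends a simple moment-based contrast, the absolute fourth cumulant, as a crude stand-in for the relative-entropy contrast analyzed here; the step size, iteration count, and test distribution are all illustrative assumptions.

```python
# Generic projection-pursuit sketch with a moment-based contrast.
# NOT the paper's algorithm: the fourth cumulant stands in for relative entropy.
import numpy as np

rng = np.random.default_rng(2)

def contrast(X, v):
    """Absolute fourth cumulant of the projection X @ v
    (equals |excess kurtosis| for unit-variance projections)."""
    y = X @ v
    return abs(np.mean(y**4) - 3 * np.mean(y**2)**2)

def pursue(X, steps=200, lr=0.1, eps=1e-4):
    """Crude pursuit: numerical gradient ascent on the contrast over the unit sphere."""
    n = X.shape[1]
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(steps):
        grad = np.zeros(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = eps
            grad[i] = (contrast(X, (v + e) / np.linalg.norm(v + e))
                       - contrast(X, (v - e) / np.linalg.norm(v - e))) / (2 * eps)
        v += lr * grad
        v /= np.linalg.norm(v)  # stay on the unit sphere
    return v

# Demo: 5-dimensional data, non-Gaussian (uniform) along the first axis only.
m, n = 50_000, 5
X = rng.standard_normal((m, n))
X[:, 0] = rng.uniform(-np.sqrt(3), np.sqrt(3), m)
print(np.round(pursue(X), 2))  # should concentrate on the first coordinate
```

A one-dimensional contrast like this recovers a single direction of $\Gamma^{\perp}$; repeating the search in the orthogonal complement of the directions found so far extends it to a subspace.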