Smooth finite-sum optimization has been widely studied in both convex and
nonconvex settings. However, existing lower bounds for finite-sum optimization
are mostly limited to the setting where each component function is (strongly)
convex, while lower bounds for nonconvex finite-sum optimization remain
largely open. In this paper, we study lower bounds for smooth nonconvex
finite-sum optimization, where the objective function is the average of $n$
nonconvex component functions. We prove tight lower bounds for the complexity
of finding an $\epsilon$-suboptimal point and an $\epsilon$-approximate stationary
point in different settings, for a wide regime of the smallest eigenvalue of
the Hessian of the objective function (or each component function). Given our
lower bounds, we show that existing algorithms, including KatyushaX
(Allen-Zhu, 2018), Natasha (Allen-Zhu, 2017), RapGrad (Lan and Yang, 2018), and
StagewiseKatyusha (Chen and Yang, 2018), achieve optimal Incremental
First-order Oracle (IFO) complexity (i.e., number of IFO calls) up to logarithmic
factors for nonconvex finite-sum optimization. We also point out potential ways
to further improve these complexity results, either by making stronger
assumptions or by using a different convergence analysis.