Authors: Ross Anderson, Joey Huchette, Christian Tjandraatmadja, Juan Pablo Vielma
ArXiv: 1811.08359
Abstract URL: http://arxiv.org/abs/1811.08359v2
We present an ideal mixed-integer programming (MIP) formulation for a
rectified linear unit (ReLU) appearing in a trained neural network. Our
formulation requires a single binary variable and no additional continuous
variables beyond the input and output variables of the ReLU. We contrast it
with an ideal "extended" formulation with a linear number of additional
continuous variables, derived through standard techniques. An apparent drawback
of our formulation is that it requires an exponential number of inequality
constraints, but we provide a routine to separate the inequalities in linear
time. We also prove that these exponentially many constraints are
facet-defining under mild conditions. Finally, we study network verification
problems and observe that dynamically separating over the exponential family
of inequalities 1) is much more computationally efficient and scalable than the
extended formulation, 2) decreases the solve time of a state-of-the-art MIP
solver by a factor of 7 on smaller instances, and 3) nearly matches the dual
bounds of a state-of-the-art MIP solver on harder instances, after just a few
rounds of separation and in orders of magnitude less time.
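For readers unfamiliar with MIP encodings of ReLUs, the following is a minimal background sketch, not the paper's ideal formulation: the standard big-M encoding of a single unit y = max(0, w^T x + b), assuming finite bounds L <= w^T x + b <= U are known in advance (e.g., from interval arithmetic over the input box).

\[
% assumption: L, U are precomputed finite bounds on the pre-activation w^T x + b
\begin{aligned}
y &\ge w^\top x + b, \qquad y \ge 0,\\
y &\le w^\top x + b - L\,(1 - z),\\
y &\le U\,z,\\
z &\in \{0, 1\}.
\end{aligned}
\]

Here z = 1 forces y = w^T x + b (the unit is active) and z = 0 forces y = 0. Like the formulation described in the abstract, this sketch uses a single binary variable and no extra continuous variables, but its linear relaxation is generally not ideal; the abstract's formulation achieves idealness over the same variables at the cost of exponentially many inequalities, which are then separated in linear time.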