
A lattice-based approach to the expressivity of deep ReLU neural networks

lib:14183bd2c202519d (v1.0.0)

Authors: Vincent Corlay, Joseph J. Boutros, Philippe Ciblat, Loic Brunel
ArXiv: 1902.11294
Document: PDF / DOI
Abstract URL: https://arxiv.org/abs/1902.11294v2


We present new families of continuous piecewise linear (CPWL) functions in $\mathbb{R}^n$ whose number of affine pieces grows exponentially in $n$. We show that these functions can be seen as a high-dimensional generalization of the triangle wave function used by Telgarsky in 2016. We prove that they can be computed by ReLU networks whose depth is quadratic and whose width is linear in the space dimension. We also investigate the approximation error incurred when one of these functions is computed by a shallower network, and prove a separation result. The main difference between our functions and other constructions is their practical interest: they arise in the context of channel coding, so computing such functions amounts to performing a decoding operation.
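The one-dimensional special case mentioned in the abstract, Telgarsky's triangle wave, already shows how depth buys exponentially many affine pieces. The sketch below is a minimal NumPy illustration of that well-known construction, not code from the paper; the names relu, tent, and triangle_wave are our own labels.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent(x):
    # A single "tooth", computable by a two-neuron ReLU layer:
    # t(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def triangle_wave(x, k):
    # Composing the tent map k times yields a triangle wave with
    # 2^(k-1) peaks, i.e. 2^k affine pieces, from a network of depth O(k).
    for _ in range(k):
        x = tent(x)
    return x

x = np.linspace(0.0, 1.0, 1001)
y = triangle_wave(x, 3)   # 4 peaks, 8 affine pieces on [0, 1]
print(y.min(), y.max())   # values stay within [0, 1]

Each composition doubles the number of affine pieces while the width stays constant, whereas a shallow network matching this oscillation count needs exponentially many units; this is the separation phenomenon that the paper's lattice-based functions extend to $\mathbb{R}^n$.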

