On the Number of Linear Regions of Convolutional Neural Networks With Piecewise Linear Activations

Document Type

Article

Publication Title

IEEE Transactions on Pattern Analysis and Machine Intelligence

Abstract

One fundamental problem in deep learning is understanding the excellent performance of deep Neural Networks (NNs) in practice. One explanation for this superiority is that NNs can realize a large family of complicated functions, i.e., they have powerful expressivity. The expressivity of a Neural Network with Piecewise Linear activations (PLNN) can be quantified by the maximal number of linear regions into which it can partition its input space. In this paper, we provide several mathematical results needed for studying the linear regions of Convolutional Neural Networks with Piecewise Linear activations (PLCNNs), and use them to derive the maximal and average numbers of linear regions for one-layer PLCNNs. Furthermore, we obtain upper and lower bounds on the number of linear regions of multi-layer PLCNNs. Our results suggest that, in terms of the number of linear regions, deeper PLCNNs have more powerful expressivity than shallow PLCNNs, and PLCNNs have more expressivity per parameter than fully-connected PLNNs.
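As an illustrative aside (not part of the paper), the linear-region count used as the expressivity measure above can be estimated empirically for a one-layer ReLU network: each region corresponds to a distinct activation pattern of the hidden units, so counting distinct patterns over a dense input grid lower-bounds the region count. The network sizes, random weights, and grid range below are arbitrary choices for the sketch; the classical upper bound for n hyperplanes in R^d (Zaslavsky's theorem) is the sum of binomial coefficients C(n, i) for i = 0..d.

```python
import numpy as np
from math import comb

# Hypothetical toy setup: a one-layer ReLU network with 3 hidden units
# acting on 2-D inputs. Each hidden unit defines one hyperplane.
rng = np.random.default_rng(0)
n_neurons, d = 3, 2
W = rng.standard_normal((n_neurons, d))
b = rng.standard_normal(n_neurons)

# Sample a dense grid and record each point's ReLU activation pattern
# (which side of each hyperplane the point falls on).
xs = np.linspace(-5.0, 5.0, 400)
grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, d)
patterns = grid @ W.T + b > 0
n_regions = len({tuple(p) for p in patterns})

# Zaslavsky's bound: n hyperplanes in R^d cut it into at most
# sum_{i=0}^{d} C(n, i) regions; here C(3,0)+C(3,1)+C(3,2) = 7.
max_regions = sum(comb(n_neurons, i) for i in range(d + 1))
print(n_regions, "regions counted; theoretical maximum is", max_regions)
```

Because the grid is finite, the count is a lower bound on the true number of regions; refining the grid or enlarging its range can only reveal more patterns, never fewer.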

First Page

5131

Last Page

5148

DOI

10.1109/TPAMI.2024.3361155

Publication Date

7-1-2024

Keywords

Convolutional neural network, expressivity, hyperplane arrangement, linear region, piecewise linear
