Abstract
We study the expressive power of deep polynomial neural networks through the
geometry of their neurovarieties. We introduce the notion of the activation degree
threshold of a network architecture to express when the dimension of the
neurovariety achieves its theoretical maximum. We prove the existence of the
activation degree threshold for all polynomial neural networks without width-one
bottlenecks and establish a universal upper bound that is quadratic in the
largest width. In doing so, we prove the high activation degree conjecture of Kileel,
Trager, and Bruna. Certain structured architectures have exceptional activation
degree thresholds, making them especially expressive in the sense of their
neurovariety dimension. In this direction, we prove that polynomial neural networks
with equiwidth architectures are maximally expressive by showing their activation
degree threshold is one.
Keywords
algebraic geometry, machine learning, neural networks
Mathematical Subject Classification
Primary: 14Q30, 68T07
Milestones
Received: 13 March 2025
Revised: 28 July 2025
Accepted: 3 August 2025
Published: 28 September 2025
© 2025 MSP (Mathematical Sciences Publishers).