ISSN: 1944-4184 (e-only)
ISSN: 1944-4176 (print)
Learning a performance metric of Buchberger's algorithm

Jelena Mojsilović, Dylan Peifer and Sonja Petrović

Vol. 16 (2023), No. 2, 227–248
Abstract

What can be (machine) learned about the performance of Buchberger’s algorithm?

Given a system of polynomials, Buchberger’s algorithm computes a Gröbner basis of the ideal these polynomials generate using an iterative procedure based on multivariate long division. The runtime of each step of the algorithm is typically dominated by a series of polynomial additions, and the total number of these additions is a hardware-independent performance metric that is often used to evaluate and optimize various implementation choices. We attempt to predict, using just the starting input, the number of polynomial additions that take place during one run of Buchberger’s algorithm. Good predictions are useful for quickly estimating difficulty and understanding what features make a Gröbner basis computation hard. Our features and methods could also be used for value models in the reinforcement learning approach to optimize Buchberger’s algorithm introduced in Peifer’s thesis.
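
To make the performance metric concrete, here is a minimal toy implementation of Buchberger's algorithm (not the authors' code) that tallies the number of polynomial additions, counting one addition per reduction step. It works over Q[x, y] with lex order, represents polynomials as dicts from exponent tuples to Fraction coefficients, and uses a naive first-in-first-out pair selection strategy; the example input is an assumed illustration.

```python
from fractions import Fraction

def lead(p):
    """Leading (monomial, coefficient) of a nonzero polynomial.
    Monomials are exponent tuples, so tuple comparison gives lex order."""
    m = max(p)
    return m, p[m]

def add(p, q, scale):
    """Return p + scale*q, dropping zero terms."""
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, Fraction(0)) + scale * c
        if r[m] == 0:
            del r[m]
    return r

def term_mul(p, m, c):
    """Multiply p by the term c * x^m."""
    return {tuple(a + b for a, b in zip(mon, m)): coef * c
            for mon, coef in p.items()}

def divides(m, n):
    return all(a <= b for a, b in zip(m, n))

def top_reduce(p, G, count):
    """Top-reduce p modulo G; each reduction step is one polynomial
    addition, tallied in count[0] -- the metric discussed above."""
    r = dict(p)
    progress = True
    while r and progress:
        progress = False
        m, c = lead(r)
        for g in G:
            gm, gc = lead(g)
            if divides(gm, m):
                quot = tuple(a - b for a, b in zip(m, gm))
                r = add(r, term_mul(g, quot, c / gc), Fraction(-1))
                count[0] += 1
                progress = True
                break
    return r

def s_poly(f, g):
    """S-polynomial of f and g."""
    (fm, fc), (gm, gc) = lead(f), lead(g)
    l = tuple(max(a, b) for a, b in zip(fm, gm))
    t1 = term_mul(f, tuple(a - b for a, b in zip(l, fm)), 1 / fc)
    t2 = term_mul(g, tuple(a - b for a, b in zip(l, gm)), 1 / gc)
    return add(t1, t2, Fraction(-1))

def buchberger(F):
    """Return (Groebner basis, number of polynomial additions)."""
    count = [0]
    G = [dict(f) for f in F]
    pairs = [(i, j) for i in range(len(G)) for j in range(i + 1, len(G))]
    while pairs:
        i, j = pairs.pop(0)            # naive first-pair selection
        r = top_reduce(s_poly(G[i], G[j]), G, count)
        if r:
            pairs += [(k, len(G)) for k in range(len(G))]
            G.append(r)
    return G, count[0]

# Example (assumed toy input): the ideal (x*y - 1, y**2 - 1) in Q[x, y].
one = Fraction(1)
F = [{(1, 1): one, (0, 0): -one},    # x*y - 1
     {(0, 2): one, (0, 0): -one}]    # y**2 - 1
basis, additions = buchberger(F)     # 3 basis elements, 3 additions here
```

Note that the addition count depends on the pair selection and reduction strategy, which is precisely why this metric is used to compare implementation choices.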

We show that a multiple linear regression model built from a set of easy-to-compute ideal generator statistics can predict the number of polynomial additions somewhat well, better than an uninformed model, and better than regression models built on some intuitive commutative algebra invariants that are more difficult to compute. We also train a simple recursive neural network that outperforms these linear models. Our work serves as a proof of concept, demonstrating that predicting the number of polynomial additions in Buchberger’s algorithm is a feasible problem from the point of view of machine learning.
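
The multiple-linear-regression idea can be sketched as follows. Everything here is a placeholder: the feature names (number of generators, maximum degree, total term count) are assumed stand-ins for the paper's generator statistics, and the target is synthetic noise-plus-signal, not measurements from actual Buchberger runs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-ideal features (assumed, not the paper's dataset):
# [number of generators, maximum generator degree, total number of terms].
n_samples = 200
X = rng.integers(2, 12, size=(n_samples, 3)).astype(float)

# Synthetic stand-in for log(number of polynomial additions):
# a made-up linear signal plus noise.
true_w = np.array([0.8, 1.5, 0.3])
y = X @ true_w + 2.0 + rng.normal(scale=0.1, size=n_samples)

# Multiple linear regression via least squares; intercept via a column of 1s.
A = np.hstack([X, np.ones((n_samples, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Compare against the uninformed baseline that always predicts the mean.
resid = y - A @ w
r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
```

An R² above zero is exactly the sense in which such a model beats an uninformed (mean-predicting) baseline; on real Buchberger data the fit is of course far noisier than on this synthetic example.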

Keywords
Groebner basis, machine learning
Mathematical Subject Classification
Primary: 13P10, 14Q20
Secondary: 62J20
Milestones
Received: 23 June 2021
Revised: 31 May 2022
Accepted: 12 June 2022
Published: 26 May 2023

Communicated by Kenneth S. Berenhaut
Authors
Jelena Mojsilović
Department of Mathematics
Purdue University
West Lafayette, IN
United States
Dylan Peifer
Department of Mathematics
Cornell University
Ithaca, NY
United States
Sonja Petrović
Department of Applied Mathematics
Illinois Institute of Technology
Chicago, IL
United States