Boolformer: Symbolic Regression of Logic Functions with Transformers
Abstract
In this work, we introduce Boolformer, the first Transformer architecture trained to perform end-to-end symbolic regression of Boolean functions. First, we show that it can predict compact formulas for complex functions not seen during training, when provided with a clean truth table. Then, we demonstrate its ability to find approximate expressions when given incomplete and noisy observations. We evaluate the Boolformer on a broad set of real-world binary classification datasets, demonstrating its potential as an interpretable alternative to classic machine learning methods. Finally, we apply it to the widespread task of modelling the dynamics of gene regulatory networks. Using a recent benchmark, we show that the Boolformer is competitive with state-of-the-art genetic algorithms while being several orders of magnitude faster. Our code and models are publicly available.
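To make the task concrete, below is a minimal, self-contained sketch of the problem setting: given only a function's truth table, find a compact formula over AND, OR, and NOT that reproduces it. The brute-force enumerator and all names here are illustrative assumptions of ours, not the paper's method; the Boolformer replaces this kind of exhaustive search with a Transformer that maps the observed truth table directly to a formula.

```python
# Illustrative sketch of Boolean symbolic regression (not the paper's model):
# enumerate formulas over AND/OR/NOT by increasing size until one matches
# the target truth table, so the first hit is among the most compact fits.
import itertools

N_VARS = 3

def truth_table(f, n=N_VARS):
    """Evaluate f on all 2^n assignments, in lexicographic input order."""
    return tuple(f(*bits) for bits in itertools.product([False, True], repeat=n))

# Hypothetical target the regressor must recover from its truth table alone.
target = lambda a, b, c: (a and not b) or c
target_tt = truth_table(target)

def candidates(n=N_VARS, max_size=7):
    """Yield (expression, callable) pairs ordered by formula size (node count)."""
    by_size = {1: [(f"x{i}", (lambda i: lambda *xs: xs[i])(i)) for i in range(n)]}
    for size in range(2, max_size + 1):
        found = []
        # NOT wraps a formula that is one node smaller.
        for expr, fn in by_size[size - 1]:
            found.append((f"~{expr}", (lambda g: lambda *xs: not g(*xs))(fn)))
        # AND / OR combine two smaller formulas whose sizes sum to size - 1.
        for left_size in range(1, size - 1):
            right_size = size - 1 - left_size
            for (le, lf), (re_, rf) in itertools.product(by_size[left_size],
                                                         by_size[right_size]):
                found.append((f"({le} & {re_})",
                              (lambda p, q: lambda *xs: p(*xs) and q(*xs))(lf, rf)))
                found.append((f"({le} | {re_})",
                              (lambda p, q: lambda *xs: p(*xs) or q(*xs))(lf, rf)))
        by_size[size] = found
    for size in sorted(by_size):
        yield from by_size[size]

for expr, fn in candidates():
    if truth_table(fn) == target_tt:
        print("recovered:", expr)  # e.g. (x2 | (x0 & ~x1))
        break
```

Such enumeration blows up exponentially with formula size and input dimension, which is precisely what makes a learned sequence-to-sequence predictor attractive for this task.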
Community
The following similar papers were recommended by the Semantic Scholar API:
- A Neural-Guided Dynamic Symbolic Network for Exploring Mathematical Expressions from Data (2023)
- LogicMP: A Neuro-symbolic Approach for Encoding First-order Logic Constraints (2023)
- Improving Length-Generalization in Transformers via Task Hinting (2023)
- Auto-Regressive Next-Token Predictors are Universal Learners (2023)
- PROSE: Predicting Operators and Symbolic Expressions using Multimodal Transformers (2023)