The high-energy physics community is investigating the feasibility of deploying more machine-learning-based solutions on FPGAs to meet the sensitivity and latency demands of modern physics experiments. In this contribution, we introduce a novel end-to-end procedure that utilises an often-forgotten method in machine learning: symbolic regression (SR), which searches the space of mathematical expressions to discover algebraic relations that approximate a dataset. We use PySR (a software package for uncovering such expressions via evolutionary algorithms) and extend the functionality of hls4ml (a package for machine-learning inference on FPGAs) to support PySR-generated expressions in resource-constrained production environments. Deep learning models often optimise a single top-level metric at a fixed network size, because the vast hyperparameter space makes extensive neural architecture search impractical. SR, in contrast, yields a set of models along the Pareto front of accuracy versus complexity, which allows the performance-resource tradeoff to be optimised directly. By embedding symbolic forms, our implementation can dramatically reduce the computational resources needed to perform critical tasks. We validate our procedure on multiple physics benchmarks as an alternative to deep learning and decision-tree models.
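As a minimal illustration of the Pareto-front selection idea described above, the sketch below filters a set of candidate expressions, each scored by a (complexity, validation error) pair, down to the non-dominated set. The scoring values and function name are illustrative assumptions, not the PySR API; PySR exposes an analogous complexity/accuracy tradeoff across its discovered equations.

```python
def pareto_front(candidates):
    """Return the candidates not dominated on (complexity, error).

    A candidate dominates another if it is no worse on both axes
    (lower is better) and strictly better on at least one.
    """
    front = []
    for c, e in candidates:
        dominated = any(
            (c2 <= c and e2 <= e) and (c2 < c or e2 < e)
            for c2, e2 in candidates
        )
        if not dominated:
            front.append((c, e))
    return sorted(front)

# Hypothetical scores: expression complexity vs. validation error
# for five fitted symbolic expressions.
models = [(1, 0.90), (3, 0.40), (3, 0.55), (7, 0.10), (9, 0.12)]
print(pareto_front(models))  # → [(1, 0.9), (3, 0.4), (7, 0.1)]
```

A user would then pick the front member whose complexity fits the available FPGA resource budget, rather than re-training a network at a different size.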
Consider for long presentation: Yes