May 8 – 12, 2023
Norfolk Waterside Marriott
US/Eastern timezone

Transformers for Generalized Fast Shower Simulation

May 11, 2023, 3:00 PM
15m
Marriott Ballroom I (Norfolk Waterside Marriott)
235 East Main Street, Norfolk, VA 23510
Oral
Track 9 - Artificial Intelligence and Machine Learning
Track 3+9 Crossover

Speaker

Raikwar, Piyush (CERN)

Description

Recently, transformers have proven to be a generalized architecture for various data modalities, ranging from text (BERT, GPT-3) and time series (PatchTST) to images (ViT) and even combinations of them (DALL-E 2, OpenAI Whisper). Additionally, given enough data, transformers can learn better representations than other deep learning models, thanks to their weak inductive biases, better modeling of long-range dependencies, and interpolation and extrapolation capabilities. The transformer is therefore a promising model to explore for fast shower simulation, where the goal is to generate synthetic particle showers, i.e., the energy depositions in the calorimeter. The transformer should accurately model the non-trivial structure of particle showers and quickly adapt to new detector geometries. Furthermore, the attention mechanism enables the model to better learn the complex conditional distribution of energy depositions in the detector. In this work, we present how transformers can be used for accurate and fast shower simulation, along with practical insights into the transformer architecture, input data representation, sequence formation, and learning mechanism.
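To make the ingredients above concrete, the sketch below shows one way the pieces described in the abstract (tokenized energy depositions, a conditioning signal for the incident particle, and self-attention over the resulting sequence) could fit together. It is a minimal, hypothetical PyTorch example, not the authors' actual model: the class name ShowerTransformer, the flat per-cell tokenization, and all hyperparameters are assumptions made for illustration.

# Minimal sketch, assuming a simplified setup: each calorimeter cell
# (voxel) becomes one token carrying its energy deposition, the
# incident-particle energy is prepended as a conditioning token, and
# self-attention models dependencies across the whole shower.
# Names and hyperparameters are illustrative, not the authors' model.
import torch
import torch.nn as nn

class ShowerTransformer(nn.Module):
    def __init__(self, n_cells, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.cell_embed = nn.Linear(1, d_model)           # energy value -> token embedding
        self.pos_embed = nn.Embedding(n_cells, d_model)   # cell index stands in for geometry position
        self.cond_embed = nn.Linear(1, d_model)           # incident energy -> conditioning token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)                 # per-cell energy prediction

    def forward(self, cell_energies, incident_energy):
        # cell_energies: (batch, n_cells); incident_energy: (batch, 1)
        b, n = cell_energies.shape
        pos = torch.arange(n, device=cell_energies.device).expand(b, n)
        tokens = self.cell_embed(cell_energies.unsqueeze(-1)) + self.pos_embed(pos)
        cond = self.cond_embed(incident_energy).unsqueeze(1)   # (batch, 1, d_model)
        seq = torch.cat([cond, tokens], dim=1)                 # prepend the condition token
        out = self.encoder(seq)
        return self.head(out[:, 1:, :]).squeeze(-1)            # drop the condition slot

# Toy usage: a 64-voxel layout with a batch of two showers.
model = ShowerTransformer(n_cells=64)
showers = torch.rand(2, 64)
e_inc = torch.rand(2, 1)
print(model(showers, e_inc).shape)  # torch.Size([2, 64])

Prepending the condition as its own token lets every cell attend to the incident energy directly, which is one simple way to realize the conditional distribution the abstract mentions; patch-based tokenizations in the style of ViT or PatchTST would be a natural alternative for large detector geometries.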

Consider for long presentation: Yes
