Conveners
Track 3+9 Crossover: Simulation
- Marilena Bandieramonte (University of Pittsburgh)
- Sofia Vallecorsa (CERN)
Track 3+9 Crossover: FastSim
- Sandro Wenzel (CERN)
- Marilena Bandieramonte (University of Pittsburgh)
IDEA (Innovative Detector for an Electron-positron Accelerator) is a general-purpose detector concept designed to study electron-positron collisions at future e$^+$e$^-$ circular colliders (FCC-ee and CEPC).
The detector will be equipped with a dual-readout calorimeter able to measure separately the hadronic and electromagnetic components of the showers initiated...
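As a rough illustration of the dual-readout technique mentioned above (this sketch is not taken from the abstract), the scintillation and Cherenkov signals can be combined to recover the deposited energy independently of the electromagnetic shower fraction; the h/e values below are placeholders, not IDEA calibration constants.

```python
# Illustrative dual-readout energy correction (h/e values are placeholders,
# not IDEA calibration constants).
def dual_readout_energy(S, C, he_scint=0.7, he_cher=0.3):
    """Combine scintillation (S) and Cherenkov (C) signals using the
    standard dual-readout formula E = (S - chi*C) / (1 - chi),
    with chi = (1 - h/e_scint) / (1 - h/e_cher)."""
    chi = (1.0 - he_scint) / (1.0 - he_cher)
    return (S - chi * C) / (1.0 - chi)

# Example: a 100 GeV hadron shower with electromagnetic fraction f_em = 0.5
f_em, E_true = 0.5, 100.0
S = E_true * (f_em + 0.7 * (1.0 - f_em))  # scintillation response
C = E_true * (f_em + 0.3 * (1.0 - f_em))  # Cherenkov response
print(dual_readout_energy(S, C))          # ~100.0 GeV
```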
PARSIFAL (PARametrized SImulation) is a software tool that can reproduce the complete response of both triple-GEM and micro-RWELL based trackers. It accounts for the physical processes involved through simple parametrizations and is therefore very fast. Existing software such as GARFIELD++ is robust and reliable, but very CPU-time consuming. The implementation of PARSIFAL was driven by the...
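As a minimal sketch of the parametrized-response idea (not PARSIFAL's actual model; all parameter names and values are illustrative), the detector response to a track can be sampled from closed-form distributions instead of tracking every electron microscopically:

```python
# Hypothetical parametrized response of a gaseous tracker: sample cluster
# positions, diffusion, and gain instead of a full microscopic simulation.
import numpy as np

rng = np.random.default_rng(0)

def fast_track_response(track_length_cm,
                        clusters_per_cm=30.0,            # mean primary clusters per cm
                        mean_gain=8000.0,                # effective amplification gain
                        theta=0.8,                       # Polya gain-fluctuation parameter
                        diffusion_um_per_sqrt_cm=120.0,  # transverse diffusion coefficient
                        drift_cm=0.3):                   # drift distance
    n_clusters = rng.poisson(clusters_per_cm * track_length_cm)
    x = rng.uniform(0.0, track_length_cm, n_clusters)    # cluster positions along the track
    x += rng.normal(0.0, diffusion_um_per_sqrt_cm * 1e-4 * np.sqrt(drift_cm), n_clusters)
    # Polya gain fluctuations are equivalent to a Gamma distribution with mean `mean_gain`
    gain = rng.gamma(1.0 + theta, mean_gain / (1.0 + theta), n_clusters)
    return x, gain

positions, charges = fast_track_response(track_length_cm=1.0)
print(len(positions), charges.mean())
```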
AtlFast3 is the new, high-precision fast simulation in ATLAS, deployed by the collaboration to replace AtlFastII, the fast simulation tool that was successfully used for most of Run 2. AtlFast3 combines a parametrization-based Fast Calorimeter Simulation with a new machine-learning-based Fast Calorimeter Simulation built on Generative Adversarial Networks (GANs). The new fast simulation...
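As a minimal sketch of the GAN-based approach (architecture, layer sizes and conditioning scheme are illustrative assumptions, not the AtlFast3 model), a generator network can map latent noise and a truth energy to voxelised calorimeter energy deposits:

```python
# Illustrative conditional-GAN generator for calorimeter showers
# (sizes and conditioning scheme are assumptions, not AtlFast3's model).
import torch
import torch.nn as nn

class ShowerGenerator(nn.Module):
    def __init__(self, latent_dim=50, n_voxels=266):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 1, 128), nn.ReLU(),  # +1 for the truth energy
            nn.Linear(128, 256), nn.ReLU(),
            nn.Linear(256, n_voxels), nn.ReLU(),        # non-negative energy deposits
        )

    def forward(self, z, energy):
        # Condition the latent vector on the (normalised) truth energy
        return self.net(torch.cat([z, energy], dim=1))

gen = ShowerGenerator()
z = torch.randn(4, 50)                # latent noise for a batch of 4 showers
e = torch.full((4, 1), 0.65)          # e.g. 65 GeV scaled to [0, 1]
showers = gen(z, e)                   # (4, n_voxels) simulated energy deposits
print(showers.shape)
```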
The Large Field Low-energy X-ray Polarization Detector (LPD) is a gas photoelectric-effect polarization detector designed for the detailed study of transient X-ray sources in high-energy astrophysics. Previous studies have shown that the polarization degree of gamma-ray bursts (GRBs) is generally low, or that the emission is unpolarized. Considering the space background and other sources of interference, we need high...
Detailed detector simulation is the major consumer of CPU resources at LHCb, having used more than 80% of the total computing budget during Run 2 of the Large Hadron Collider at CERN. As data is collected by the upgraded LHCb detector during Run 3 of the LHC, larger requests for simulated data samples are necessary, and will far exceed the pledged resources of the experiment, even with...
Modern high-energy physics experiments fundamentally rely on accurate simulation, both to characterise detectors and to connect observed signals to underlying theory. Traditional simulation tools rely on Monte Carlo methods which, while powerful, consume significant computational resources. These computing pressures are projected to become a major bottleneck at the high luminosity...
Reliably simulating the detector response to hadrons is crucial for almost all physics programs at the Large Hadron Collider. The core component of such a simulation is the modeling of hadronic interactions. Unfortunately, there is no first-principles theory to provide guidance. The current state-of-the-art simulation tool, Geant4, exploits phenomenology-inspired parametric models, each simulating a specific...
The full simulation of particle colliders incurs a significant computational cost, and detector simulation is among the most resource-intensive steps. Future developments, such as higher collider luminosities and highly granular calorimeters, are expected to push the computational resources required for simulation beyond what is available. One possible solution is generative neural...
In High Energy Physics (HEP) experiments, the calorimeter is a key detector for measuring the energy of particles. Particles interact with the material of the calorimeter, creating cascades of secondary particles, the so-called showers. The description of the showering process relies on simulation methods that precisely describe all particle interactions with matter. Constrained by the complexity of...
Simulation is a crucial part of all aspects of collider data analysis. However, the computing challenges of the High-Luminosity era will require simulation to use a smaller fraction of computing resources, even as more complex detectors that require more detailed simulation are introduced. This motivates the use of machine learning (ML) models as surrogates to replace full...
Recently, transformers have proven to be a generalized architecture for various data modalities, ranging from text (BERT, GPT-3) and time series (PatchTST) to images (ViT), and even combinations of them (DALL-E 2, OpenAI Whisper). Additionally, when given enough data, transformers can learn better representations than other deep learning models thanks to the absence of inductive bias, better...
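As a minimal sketch of how such an architecture could be applied to detector data (the token definition, sizes and pooling are illustrative assumptions), a standard transformer encoder can turn tokenised inputs into a per-event representation:

```python
# Illustrative transformer encoder over tokenised detector data
# (the 16-dim "patch" tokens and all sizes are assumptions).
import torch
import torch.nn as nn

d_model, n_tokens, batch = 64, 32, 8

embed = nn.Linear(16, d_model)                          # project raw patches to tokens
pos = nn.Parameter(torch.zeros(1, n_tokens, d_model))   # learned positional encoding
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True),
    num_layers=2,
)

patches = torch.randn(batch, n_tokens, 16)    # e.g. flattened calorimeter patches
tokens = embed(patches) + pos                 # token embeddings plus positions
representation = encoder(tokens).mean(dim=1)  # mean-pooled per-event representation
print(representation.shape)                   # torch.Size([8, 64])
```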
At the CMS experiment, a growing reliance on the fast Monte Carlo application (FastSim) will accompany the high luminosity and detector granularity expected in Phase 2. The FastSim chain is roughly 10 times faster than the application based on the GEANT4 detector simulation and full reconstruction, referred to as FullSim. However, this advantage comes at the price of decreased accuracy in some...