Conveners
Plenary Session: Welcome / Highlighted Keynotes
- Gail Dodge (Old Dominion University)
Plenary Session: "Challenges in AI/ML" & "Opportunities in ESCAPE"
- Maria Girone (CERN)
Plenary Session: Computing Infrastructure & Future Directions
- Jerome Lauret (Brookhaven Science Associates)
Plenary Session: Streaming R/O & Data Management
- Oxana Smirnova (Lund University)
Plenary Session: Planning for the Future
- Randall Sobie (University of Victoria)
Plenary Session: Diversity & Bias / Inclusive and Ethical Computing
- Gordon Watts (University of Washington)
Plenary Session: Upcoming Computing Challenges beyond NP/HEP
- Torre Wenaus (BNL)
Plenary Session: Tracks 1–6 Highlights
- Paul Laycock (BNL)
- Oksana Shadura (University of Nebraska-Lincoln (US))
Plenary Session: Tracks 7–X Highlights and Close-out
- Raffaella De Vita (INFN - Genova)
- Xavier Espinal (CERN)
In today's Nuclear Physics (NP), the origin, evolution, and structure of the universe's matter are explored through a broad research program at various collaborative scales, ranging from small groups to large experiments comparable in size to those in high-energy physics (HEP). Consequently, software and computing efforts range from DIY approaches among a few researchers to...
Instead of focusing on the concrete challenges of incremental changes to HEP driven by AI/ML, it is perhaps a useful exercise to think through more radical, speculative changes. What might be enabled if we embraced a dramatically different approach? What would we lose? How would those changes impact the computational, organizational, and epistemological nature of the field?
Simulation is a critical component of high energy physics research, with a corresponding computing footprint. Generative AI has emerged as a promising complement to compute-intensive full simulation, offering relatively high accuracy compared to existing classical fast-simulation alternatives. Such algorithms are naturally suited to acceleration on coprocessors, potentially running fast enough to match the...
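The surrogate idea behind generative fast simulation can be caricatured in a few lines: fit a cheap sampler to the output of an expensive "full simulation" and draw events from the fit instead. This is only a minimal illustrative sketch; the actual approaches discussed use deep generative models (GANs, VAEs, diffusion), and all function names and numbers below are assumptions, not the talk's implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def full_simulation(n_events):
    """Stand-in for an expensive Geant4-style simulation: returns
    one total deposited energy per event (GeV). The Gaussian
    detector response around 50 GeV is a made-up example."""
    return rng.normal(loc=50.0, scale=5.0, size=n_events)

# "Train" the surrogate: here we simply fit the first two moments of
# a reference full-simulation sample. A real generative fast
# simulation would train a neural network on such reference events.
reference = full_simulation(10_000)
mu, sigma = reference.mean(), reference.std()

def fast_surrogate(n_events):
    """Cheap generative stand-in: sample from the fitted distribution.
    Sampling here is trivially vectorized, which is what makes such
    surrogates attractive for coprocessor acceleration."""
    return rng.normal(loc=mu, scale=sigma, size=n_events)

generated = fast_surrogate(10_000)
print(f"full sim mean  = {reference.mean():.2f} GeV")
print(f"surrogate mean = {generated.mean():.2f} GeV")
```

The accuracy question raised in the abstract then becomes: how closely does the surrogate's distribution match the reference, at a fraction of the cost per event?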
A [Dark Matter Science Project][1] is being developed in the context of the [ESCAPE project][2] as a collaboration between scientists in European Research Infrastructures and experiments seeking to explain the nature of dark matter (such as HL-LHC, KM3NeT, CTA, DarkSide).
The goal of this ESCAPE Science Project is to highlight the synergies between different dark matter communities and...
The EU-funded ESCAPE project has brought together the ESFRI and other world-class Research Infrastructures in High Energy and Nuclear Physics, Astro-Particle Physics, and Astronomy. In the three years of the project, many synergistic and collaborative aspects have been highlighted and explored, from purely technical collaboration on common solutions for data management, AAI, and workflows, through...
One of the objectives of the EOSC (European Open Science Cloud) Future Project is to integrate diverse analysis workflows from Cosmology, Astrophysics and High Energy Physics in a common framework. The project’s development relies on the implementation of the Virtual Research Environment (VRE), a prototype platform supporting the goals of Dark Matter and Extreme Universe Science Projects in...
The large data volumes expected from the High Luminosity LHC (HL-LHC) present challenges to existing paradigms and facilities for end-user data analysis. Modern cyberinfrastructure tools provide a diverse set of services that can be composed into a system giving physicists straightforward access to large computing resources, with low barriers to entry....
ALICE has upgraded many of its detectors for LHC Run 3 to operate in continuous readout mode, recording Pb-Pb collisions at a 50 kHz interaction rate without a trigger.
This results in the need to process data in real time at rates 50 times higher than during Run 2. To tackle this challenge, we introduced O2, a new computing system and its associated infrastructure. Designed and...
Streaming Readout Data Acquisition systems coupled with distributed resources spread over vast geographic distances present new challenges to the next generation of experiments. High-bandwidth modern network connectivity opens the possibility of utilizing large, general-use HTC systems that are not necessarily located close to the experiment. Near real-time response rates and workflow...
As nuclear physics collaborations and experiments increase in size, the data management and software practices in this community have changed as well. Large nuclear physics experiments at Brookhaven National Lab (STAR, PHENIX, sPHENIX), at Jefferson Lab (GlueX, CLAS12, MOLLER), and at the Electron-Ion Collider (ePIC) are taking different approaches to data management, building on existing...
This presentation will cover the content of the report delivered by the Snowmass Computational Frontier late in 2022. A description of the frontier organization and various preparatory events, including the Seattle Community Summer Study (CSS), will be followed by a discussion of the evolution of computing hardware and the impact of newly established and emerging technologies, including...
The U.S. Nuclear Physics community has been conducting long-range planning (LRP) for nuclear science since the late 1970s. The process is known as the Nuclear Science Advisory Committee (NSAC) LRP; NSAC is an advisory body jointly appointed by the U.S. Department of Energy and the U.S. National Science Foundation. The last NSAC LRP was completed in 2015, and the current NSAC LRP is ongoing...
Today's students are tomorrow's leaders in science, technology, engineering and math. To ensure the best minds reach their potential tomorrow, it's vital to ensure that students not only experience meaningful STEM learning today, but also have the opportunities and support to pursue careers in a STEM environment that is more welcoming, inclusive and just. This panel will feature expertise from...
The WLCG infrastructure provides the compute power and storage capacity for the computing needs of the experiments at the Large Hadron Collider (LHC) at CERN. The infrastructure is distributed across over 170 data centers in more than 40 countries. The energy consumed by WLCG to support the scientific program of the LHC experiments, and its evolution, depend on several factors:...
The nature and origin of dark matter are among the most compelling mysteries of contemporary science. There is strong evidence for dark matter from its role in shaping the galaxies and galaxy clusters that we observe in the universe. Still, physicists have tried to detect dark matter particles for over three decades with little success.
This talk will describe the leading effort in that...
The increasing volumes of data produced at light sources such as the Linac Coherent Light Source (LCLS) enable the direct observation of materials and molecular assemblies at the length and time scales of molecular and atomic motion. This exponential increase in the scale and speed of data production outpaces traditional analysis workflows that rely on scientists tuning parameters...
The astronomy world is moving towards exascale observatories and experiments, with data distribution and access challenges that are comparable with - and on similar timescales to - those of HL-LHC. I will present some of these use cases and show progress towards prototyping work in the Square Kilometre Array (SKA) Regional Centre Network, and present the conceptual architectural view of the...